• Research article
  • Open access
  • Published: 29 April 2020

A balancing act: a window into online student engagement experiences

  • Orna Farrell   ORCID: orcid.org/0000-0001-9519-2380 1 &
  • James Brunton   ORCID: orcid.org/0000-0001-7223-0524 1  

International Journal of Educational Technology in Higher Education, volume 17, Article number: 25 (2020)


Abstract

This article reports on a qualitative study which explored online student engagement experiences in a higher education institution. There are very few studies providing in-depth perspectives on the engagement experiences of online students. The project adopted a case study approach, following 24 online students over one academic year. The setting for the study was an undergraduate online Humanities programme at Dublin City University. The research question for the study was: What themes are central to online student engagement experiences? Data was collected from participant-generated learning portfolios and semi-structured interviews and analysed following a data-led thematic approach. The five central themes that make up the study’s findings highlight key issues of students’ sense of community, their support networks, balancing study with life, confidence, and their learning approaches. The findings of this study indicate that successful online student engagement was influenced by a number of psychosocial factors, such as peer community, an engaging online teacher, and confidence, and by structural factors, such as lifeload and course design. One limitation of the study is that it is a relatively small, qualitative study; nonetheless, its findings provide insights into how online degrees can support online students to achieve successful and engaging learning experiences.

Introduction

This study set out to explore the central themes relating to online student engagement experiences. A case study approach was adopted, following 24 online students over one academic year as they studied towards a BA in Humanities degree through DCU Connected, in Dublin City University (DCU). DCU Connected delivers flexible, undergraduate and postgraduate programmes through the mode of online learning and aims to afford educational opportunities to students who have not managed to access more traditional entry routes into higher education.

Online learning is one of the fastest-growing areas of education worldwide because it provides access to educational opportunities in a flexible manner to students from diverse backgrounds and geographical regions who often cannot access higher education by other means (Delaney & Fox, 2013; Roll, Russell, & Gašević, 2018). In Ireland, online higher education courses are less prevalent than in other jurisdictions such as Australia or the United States, but the number of students enrolling in online higher education courses in Ireland is growing, which is evident from the increase in online-only, government-sponsored Springboard degree courses, which grew from 10% of courses in 2011 to 15% in 2016 (HEA, 2016; HEA, 2018). However, online students have been shown to be more vulnerable to attrition, with online degree programmes having lower rates of completion in comparison with traditional ones (Woodley & Simpson, 2014). Online students’ lower completion rates may be attributed to issues with time management and lifeload, unrealistic expectations, a sense of isolation and a perception that they are less valued by the institutional culture (Brown, Hughes, Keppell, Hard, & Smith, 2015; Mallman & Lee, 2016; Nichols, 2011; O’ Shea, Stone, & Delahunty, 2015). The reasons underlying online student non-completion are a complex set of factors which encompass student engagement and success, and it is therefore important that the needs of online students are better understood to facilitate their success and engagement in higher education (Brunton, Brown, Costello, & Farrell, 2018; Kahu & Nelson, 2018).

There is very little literature about the experiences of online students in the Irish higher education (HE) context, with the majority of the relevant literature based in Australasia. While the experiences of campus-based undergraduate students have been thoroughly explored, “the experiences of online students has been somewhat ignored in the literature” (O’ Shea et al., 2015, p. 57). The current study focused on the experiences of online students in the Irish higher education context, aiming to improve our understanding of a cohort of students that is under-researched in that context.

Contexts from the literature

This section presents contexts from the literature about online student success, study habits and engagement. Student engagement can be defined as “a student’s emotional, behavioural and cognitive connection to their study” which has a direct impact on student success and achievement (Kahu, Stephens, Zepke, & Leach, 2014 , p. 523). As online degree programmes have lower rates of retention and graduation than campus based undergraduate courses, it is important that a greater understanding of student engagement in the online context is developed (Woodley & Simpson, 2014 ).

There are a number of interlinked factors reported in the literature which affect online student experiences and retention: time management skills; the ability to balance work, family, etc. with study; autonomy; community; sense of belonging; motivation; course design; and support structures at institutional, programme and teacher levels (Blackmon & Major, 2012 ; Brown et al., 2015 ; Buck, 2016 ; Holder, 2007 ; Zembylas, Theodorou, & Pavlakis, 2008 ).

The following sections present contexts from the literature about online student success, study habits and engagement through the lens of Kahu’s (2013) holistic conceptual framework of student engagement. Kahu’s (2013) framework has the student at the centre, interacting with the socio-cultural context, structural and psychosocial influences, engagement, and the proximal and distal consequences; see Fig. 1 below.

Fig. 1 Conceptual framework of engagement, antecedents and consequences (Kahu, 2013, p. 766). Reproduced with permission

In this study, there is a particular focus on the factors which are most relevant to online students, as “when shifting to online contexts, engagement takes on different manifestations, due to the lack of face to face contact and the ways in which teaching and learning are mediated through technology” (O’ Shea et al., 2015 , p. 43).

Socio-cultural influences

Recent high level international and Irish policy reports have emphasised the importance of lifelong learning and bringing more adult learners into higher education, through the provision of flexible study options such as online or part-time programmes (European Commission, 2014 ; Department of Education & Skills, 2011 ). This strategy has led to a slow growth in the numbers of adult students participating in online higher education (HE) in Ireland. Online learning is more affordable as students can earn as they learn and travel costs are reduced. For these reasons, online learning is important in supporting access to HE for disadvantaged groups (Castaño-Muñoz, Colucci, & Smidt, 2018 ).

Although access to Irish higher education has increased for adult learners, social inequalities continue to be reproduced, and some groups remain under-represented (Delaney & Farren, 2016). Typically, online students in Ireland are older and from lower socio-economic backgrounds; some are upskilling, many are second-chance learners, and there is an intersectionality between these identity categories (Brunton et al., 2018; Delaney & Brown, 2018). Irish online students may have delayed participation in university education for reasons relating to social class (Delaney & Farren, 2016).

Structural influences

For online students, structural influences such as course design significantly impact on their learning experiences. Online student engagement can be supported by a well-designed course which promotes interaction and social presence; creates a clear, purposeful learning journey; makes efficient use of students’ limited time; links learning activities to goals; builds on existing understanding whilst addressing gaps in understanding; and provides immersive, real-world simulations or experiences (Buck, 2016; Frey, 2015). Inappropriately designed online courses and delivery can negatively impact on online student engagement (Stone & O’Shea, 2019).

In Ireland, HE institutional supports such as library, career advice, learning support, administration and counselling services are heavily focused on full-time, on-campus students. This means that online students have reduced access, or sometimes no access, to vital university supports (Delaney & Farren, 2016; HEA, 2012). This can lead to online students feeling less integrated and engaged with the institution and feeling that they are a lower priority than campus-based students (HEA, 2012; O’ Shea et al., 2015; Yang, Baldwin, & Snelson, 2017). This can be seen as related to the barriers that exist for adult learners where the institution primarily serves, and is structured for, younger students. Such barriers will impact on a sense of belonging as adult learners negotiate an institutional understanding of learners that runs contrary to their needs, experiences, and ways of being (Fairchild, 2003).

Online students tend to have many demands on their time; the very reasons which cause them to choose this study mode can in turn cause them to withdraw (Simpson, 2004). Many online students struggle to follow a regular study schedule due to the challenges of balancing work, family and study (Blackmon & Major, 2012; Brown et al., 2015; Buck, 2016; Zembylas et al., 2008). Trying to fulfil multiple roles and juggle professional, family, social life, and study can cause online students to feel considerable stress (Brown et al., 2015; Stone & O’Shea, 2019; Zembylas et al., 2008). Kahu (2013, p. 767) describes this as lifeload, “the sum of all the pressures a student has in their life, including university”, and identifies it as a critical factor influencing student engagement. For online students, particularly those with caring responsibilities, support from family and friends is key to successful engagement, enabling them to have the time and space to study (Kahu et al., 2014; McGivney, 2004).

Psychosocial influences

Online student engagement is affected by a number of interrelated psychosocial influences such as teaching support, motivation, skills and self-efficacy.

Teaching support plays a critical role in online courses, with teacher engagement and connection having a positive effect on online student retention (Stone & O’Shea, 2019). Effective online teachers support their students through timely, proactive, embedded support which establishes their personal presence and actively engages students through synchronous and asynchronous methods (Rose Sr., 2018; Stone & O’Shea, 2019). Further, online student self-efficacy is a predictor of success: Kahu, Picton, and Nelson (2019) found that student self-efficacy influenced interest and enjoyment, and behavioural engagement with learning.

According to the literature, there are a number of key skills which contribute to successful online study, such as organisation, time management, study skills and digital competencies (Andrews & Tynan, 2012; Brown et al., 2015; Buck, 2016). Online learning is facilitated through digital technology and internet access. For online students, lacking the necessary digital skills to comfortably and competently engage with the technological aspects of online learning can be a barrier to successful engagement with their online programmes (Brown et al., 2015; O’ Shea et al., 2015). In particular, new online students struggle with the online learning environment and may need time and support to sufficiently orientate themselves (Stone & O’Shea, 2019; Yoo & Huang, 2013).

Students with more developed time management skills are more likely to continue on an online course (Holder, 2007). This involves establishing a sustainable study routine which can adapt to and account for problems (Brown et al., 2015; Kahu et al., 2014). In addition to time management, strong organisational skills and the ability to keep on task are key to being a successful online learner (Buck, 2016). Creating a positive study environment with a dedicated and quiet study space is an important organisational aspect for online students (Buck, 2016; Çakıroğlu, 2014). A further organisational aspect is the necessity to plan and structure studies around other responsibilities effectively; this can result in unusual study patterns which are highly individual, such as studying late at night or early in the morning (Andrews & Tynan, 2012; Buck, 2016).

Student engagement

In Kahu’s (2013) framework, student engagement is influenced by the socio-cultural, structural and psychosocial factors discussed above. Further, online student engagement is particularly influenced by a sense of belonging: “Online learners, perhaps more so than face-to-face learners, need deliberately orchestrated, multiple opportunities to engage with others so that expression, development, tolerance and recognition of their diverse identities may in part compensate for any lack felt by not having a physical presence” (Delahunty, Verenikina, & Jones, 2014).

Retention of online students can be facilitated by a strong sense of belonging to the institutional, programme and module community (Bowles & Brindle, 2017 ; Farrell & Seery 2019 ; Stone & O’Shea, 2019 ). Feeling that they belong to a community of learners has a significant impact on the learning experiences of online students (Buck, 2016 ; O’ Shea et al., 2015 ). The two factors that can support the development of a sense of community and belonging in students are establishing social presence and high levels of interaction in the course (Buck, 2016 ; Veletsianos & Navarrete, 2012 ). Developing social presence in the course gives students a greater sense of connection to each other, the teacher and the course (Veletsianos & Navarrete, 2012 ). Interaction and social presence can be promoted through course design which promotes active communication between students and instructors using asynchronous discussion forums and synchronous online classes (Buck, 2016 ). Community can also be fostered through informal student interaction such as social media, study groups, and email (O’ Shea et al., 2015 ). Andrews and Tynan ( 2012 ) found that informal student networks were most beneficial for participants in terms of sense of community. Informal student networks can enable online students to form positive social relationships and close ties with fellow students (Zembylas et al., 2008 ). The emphasis in the literature on building community is in response to the feelings of isolation often experienced by online students (Bolliger & Shepherd, 2010 ). Fostering a strong sense of community among students in online courses and establishing social presence can decrease students’ feelings of isolation and disconnection (Phirangee & Malec, 2017 ).

Engagement-disengagement outcomes

The outcomes or proximal/distal consequences of online student engagement are a positive learning experience, course completion and a sense of satisfaction (Kahu, 2013 ; Kahu et al., 2019 ; O’ Shea et al., 2015 ). The outcomes of online student disengagement are non-completion, withdrawal, and unsatisfactory learning experience (Kahu, 2013 ; Kahu et al., 2019 ; O’ Shea et al., 2015 ). The majority of students who withdraw do not return to study, emphasising the importance of targeted student success and engagement supports early in the study lifecycle (Brunton et al., 2018 ; Woodley & Simpson, 2014 ).

In summary, the literature highlights the importance of viewing online student engagement through the lens of socio-cultural, structural, and psychosocial influences. There is a gap in the research on the experiences of online students in Irish higher education, with the majority of the relevant literature based in Australasia, and studies such as this one can address that gap.

Our approach to online student engagement

The DCU Connected approach to online student engagement comprises a commitment to open access admission, a strategic approach to transition, flexible progression routes, a layered support system, a focus on student skills development, and a programmatic approach to online learning and assessment design; see Table 1 below.

Methodology

A qualitative case study grounded in the constructivist paradigm was designed with the aim of exploring online student engagement experiences. The present study was framed by the following research question:

What themes are central to online student engagement experiences?

The setting for the study was an undergraduate sociology module on the BA (Hons) in Humanities, an online programme delivered through DCU Connected at Dublin City University.

Ethical approval for the study was granted by the institutional research ethics committee. Purposive sampling was used to select participants for the case study. Twenty-four online, adult students gave written consent to take part in the study in the academic year 2016–2017. The cohort comprised seven males and 17 females; participants’ ages ranged from 21 to 63, with an average age of 39; participants were geographically distributed around Ireland.

Data was collected in two ways: two semi-structured interviews per participant, conducted mid-way through and at the end of the academic year; and participant-generated learning portfolio entries relating to their learning experience in the sociology module. The interviews were conducted in real time online using a private Adobe Connect classroom. Participants completed five learning portfolio entries, approximately one per month over the academic year. In the learning portfolio entries, participants included visual and written reflections.

As the researchers are insiders who work in DCU, there are limitations for the study, as issues of power and bias can emerge (Sikes & Potts, 2008). The issue of power was addressed through the use of institutional gatekeepers to access the study participants.

An iterative model of data collection and analysis was adopted. The analysis followed a data-led thematic analysis approach based on Braun and Clarke’s (2006) six phases and comprised several cycles of coding, theme generation, and refining and reviewing themes. After a number of iterations of reviewing and refining, a thematic diagram was created and each theme was defined and named. Themes constructed through the analytical process are discussed in the findings below.
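As a purely illustrative aid, and not a description of the authors’ actual analytic tooling, the short Python sketch below shows how coded data extracts might be collated under candidate themes during such iterative cycles of coding and theme review. The codes and the code-to-theme mapping shown are hypothetical; the extracts are abridged from participant quotes reported later in this article.

```python
from collections import defaultdict

# Phase 2 (illustrative): initial codes applied to data extracts.
# Extracts are abridged participant quotes; the codes are hypothetical.
coded_extracts = [
    {"participant": "P5", "extract": "thank god for the Whatsapp groups", "code": "peer support"},
    {"participant": "P1", "extract": "the lack of interaction with anyone is causing this", "code": "isolation"},
    {"participant": "P20", "extract": "the study can get squeezed", "code": "time pressure"},
    {"participant": "P13", "extract": "my confidence has grown without a shadow of a doubt", "code": "growing confidence"},
]

# Phase 3 (illustrative): a working code-to-theme mapping, revised on each analytic cycle.
candidate_themes = {
    "peer support": "Peer community",
    "isolation": "Peer community",
    "time pressure": "Studying while balancing life commitments",
    "growing confidence": "Confidence",
}

# Phases 4-5 (illustrative): collate extracts under each candidate theme
# so themes can be reviewed, refined, defined and named.
theme_to_extracts = defaultdict(list)
for item in coded_extracts:
    theme = candidate_themes.get(item["code"], "Uncategorised")
    theme_to_extracts[theme].append((item["participant"], item["extract"]))

for theme, extracts in theme_to_extracts.items():
    print(f"{theme} ({len(extracts)} extracts)")
    for participant, extract in extracts:
        print(f"  {participant}: {extract}")
```

In the study itself this collation was an interpretive, researcher-led process rather than an automated one; the sketch simply makes the structure of the coding-to-theme step concrete.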

Findings

The findings presented below encapsulate the experiences of online students engaging with an undergraduate sociology module. Five themes were constructed during the analytical process: peer community, module supports, studying while balancing life commitments, confidence, and my approach to learning. The final thematic map, illustrating the five themes created through the iterative analytic process, is presented in Fig. 2 below.

Fig. 2 Final thematic map illustrating the five themes constructed in the analytic process

Peer community

In their narratives, participants valued the role of their peers as part of their learning experience in the module and as a source of support. In the module discussion forums and online tutorials, discussion and debate with classmates enhanced student learning experiences and deepened their understanding of sociology.

“I feel I am fortunate enough to be in a group with classmates that encourage and engage in debates around various topics in sociology, whether it be in class, between breaks or through the online forum” (Participant 12, entry 1).

For participant 17, group discussions greatly facilitated his comprehension of the module concepts and content, “ Listening to fellow students speak about their thoughts and ideas about the various sociological institutions, such as education and religion made me realise that my knowledge of sociology is similar to others” (Participant 17, entry 1).

The students organically formed informal study groups, which met face to face, online and on WhatsApp; they valued the support, reassurance and sense of community offered by peers in these informal study groups.

I: it’s great to hear what other people are doing as well because you know yourself studying online can be very isolating. P5: Yeah, thank god for the Whatsapp groups. I: Yeah well I have heard that they are very helpful all right. P5: Yeah they really are when you get stuck you can just get on and say ‘Lads I am fucking stuck’ and someone will say ‘What are you stuck on, what can you not get your head around?’ And it is great for that. And also for reminding people when we have online. I: Assignments due and stuff. P5: It is just really good for that so you don’t feel like you are on your own. (Participant 5, interview 1).

Both the formal and informal communities formed by the students and tutors in the module engendered a sense of belonging to the programme and were an integral part of their approach to learning in the module.

“Yeah it’s brilliant I have become friends with a few people in the course and some of them are over from different countries doing exams so we are thinking of going for drinks, we stay in touch.” (Participant 10, interview 2).

Although many of the participants described feeling part of the class community, for two participants, studying online was an isolating experience.

“It is hard working alone and I do find that I can get a bit lost as there is not much direction from DCU as to what stage I should be at with my reading etc. This was brought up at the last tutorial but there has been no correspondence since. I think since my last post I have become a bit frustrated and have actually struggled even though I have organised myself a bit better. This sounds a bit contradictory but I think the lack of interaction with anyone is causing this.” (Participant 1, entry 2).

Module supports

Participants reported placing a high value on the support offered by their module tutors, both in synchronous online sessions and in asynchronous discussion forums. Attending and participating in tutorials was described as fundamental to learning, socialisation and progression in the module. Tutorials provided reassurance, interaction with peers and clarification of difficult concepts and theories.

“However, since then I have completed the postings for Assignment two and have attended another online tutorial. The tutorial covered the codes, conventions, theories and perspectives of Social Order. As I had been studying these subjects in the unit notes, it was very useful to have a structured discussion on them. When our tutor gives real life examples of the application of these, it makes everything easier to understand and remember. I thought that tutorial was particularly useful, as there was good interaction among the group. We were in the middle of our postings for assignment 2 at that time. (Tutor name) gave us useful pointers for the assignment, among which was to try to focus our examples on Irish society.” (P8, eportfolio entry 4).

The importance and centrality of the support provided by the module tutors was very clearly articulated in the data. The reassuring and supporting role of the tutor was very significant to the learning experience of participants in terms of clarification of concepts and assignments, encouragement, guidance on reading and approaches to study.

“One of my main difficulties in gathering my work for the first assignment, was my block on getting over what the definition of power is in sociology. When we had our first face to face tutorial with (tutor name) it made more sense and I was actually surprised at how much power was evident in everyday life, in our relationships with people and in our interactions with pretty much everyone.” (P5, entry 1).

Some participants described becoming more comfortable interacting on the online discussion forums, and perceived them to be a useful medium for interactive discussion and for viewing other students’ questions and concerns, which alleviated feelings of doubt.

“I learned that I like the interaction of the online forum for learning and I found it interesting to read what other people had researched. I found I was reading with a focus of either agreeing or disagreeing with the points that people had made. I had to choose which posts to engage with. The type of work in forums and engaging with others is helping me with my critical thinking skills.” (P4 entry 4).

In contrast, others felt nervous, exposed and disinclined to post on the discussion forum.

I: So I suppose the things that stand out to me are your pieces of evidence. Do you want to talk about them a little? P19: Oh I forgot I posted that. So one of them was my reply to (tutor name) I think it is. It was about my confidence in using the forum, and participating in that way. Because I think you’re hesitant, you don’t know how people are going to take you up on the forums. And it’s different from other bits of social media where you don’t care how people take you up. You just have to be yourself. If somebody takes me up poorly, I’m going to come across poorly to my tutor. I might foster some kind of bias. (P19, Interview 1).

This means that discussion forums, which are a key mechanism of support within the module, do not meet the support needs of some students.

Studying while balancing life commitments

The data indicated that the most challenging aspect of being an online student was studying while balancing work, family and caring responsibilities. This is very clearly articulated in participant narratives. Balancing competing demands while finding sufficient time to study, and write assignments, put participants under severe pressure. As the students are already time poor, issues such as illness had a domino effect on participants’ ability to keep on track with their studies. One or two unexpected problems in their personal lives could cause students to fall badly behind with their study and assignment work, thus impacting their learning experience.

“I’m beginning to worry that I won’t have time for a more in depth look at everything in this section, before having to move on to Crime and Deviance in advance of Assignment 2. It all comes down to time management, which I remember was an issue at this time last year. With all the extra pressures of Christmas from a work and family point of view the study can get squeezed. I may have to do less(no?) housework to facilitate my learning this month. This idea has not been negotiated with my partner and may have to be revised! Perhaps a self-imposed ban on TV for the month is a more acceptable strategy. However, all work and no play!” (P20, entry 2).

The issue of time management is very strongly articulated in the data: the pressure of finding sufficient study time, and the stress and worry participants felt about falling behind, were persistent difficulties faced throughout the academic year.

“My learning process is still haphazard and I struggle to block off sufficient time to study. I have dealt with this to some extent by spending longer hours in my office in work to catch up on my modules. The downside of this is I am available to work colleagues even though I am technically finished work and situations often arise that require my attention.” (P3, entry 3).

The majority of participants proactively planned their study, many created study schedules and task lists which they included in their eportfolio entries. For some participants this systematic approach to planning study was effective, and the establishing of regular study routines enabled them to cope with coursework. Participant 17 demonstrated his approach in his study plan for January, see Fig.  3 below.

Fig. 3 Study plan (P17, eportfolio entry 3)

For other participants, despite good intentions about creating a regular study schedule, their challenges with time management persisted, continuing to cause them considerable stress and anxiety.

“My learning process this year involved two strategies: firstly, snatching time whenever I could to get my college work done and secondly, descending into a blind panic at the last minute before having to submit work. These are not approaches that I would recommend or indeed intend to replicate as I go forward with this course.” (P7, eportfolio entry 5).

Confidence

Expressions of self-doubt, fear, apprehension and uncertainty about their academic abilities and approaches to studying feature in the participant narratives. This is evident in participant 15’s extract below:

“I am very happy to see that over the past few months my reading skills have vastly improved. Before I started my journey of third level education, my reading skills were below average at the best of time, I had a lack of confidence in myself and I could not abstract information from a text on the best of days.” (P15, eportfolio entry 3).

Some participants grew in confidence as the module progressed. Getting good assignment results and positive feedback validated their perceived abilities, and enabled them to overcome their feelings of uncertainty.

“Previously, I was unaware that self-doubt affected new writing challenges such as the SOC3A A1 article review. However, I was aware of a drive to learn and demonstrate knowledge of sociology to myself. I believe the grades validated my ability and I was no longer distracted or made anxious by self-doubt. Therefore, I have discovered growth in my confidence impacted my study habits and did not uncover contrary study habits.” (P19, eportfolio entry 3).

By proactively addressing and overcoming their perceived academic weaknesses, some participants gained confidence in their academic abilities and felt well prepared for future study.

I: And do you feel more ready for next year? P13: Totally. I mean, I think if you had asked at the start of the year how I would feel about third year I was just very overwhelmed, I was finding it so daunting and I feel like it is going to be fine, I really do. I feel like it is just going to be the same as this year with different assignment titles. I: So, your confidence has grown? P13: Oh, without a shadow of a doubt, yeah. I feel like I know how to study now, I know how to reference, I am organised, I have my diary for doing my timelines, I feel very ready. I: That is great, so a big jump. P13: Yeah. Big, big jump but I mean, I feel like it is progressing very well. (P13, interview 2).

My approach to learning

In their narratives, participants described their highly personal approaches to learning; this provides a detailed insight into their study techniques and into when, where and how they learned. As the majority of study was self-directed, participants had to develop individual techniques to aid their understanding of the sociological content, theory and concepts. A variety of online student study approaches was detailed in the data. These techniques were varied and innovative, encompassing traditional approaches such as reading, annotating and note-taking, and modern approaches to study using online resources such as YouTube videos, online lectures, podcasts, glossaries, online articles, and recordings of previous online tutorials.

“My learning process begins with reading the module unit text to get an overview of the topic. I then re-read the text more carefully, underlining and annotating. If I find I need clarification on a concept or theory I use the internet to find either a useful article or video. I then move on to the required reading.” (P18, entry 1).

Further, participant 18 provided the image below to evidence his learning approach, which he described as follows: “I have included this photograph of my annotated notes on Talcott, Gramsci and Foucault as evidence of my engagement with the module text. You can see that I underline or circle key terms. I also scribble down thoughts or connections I make”; see Fig. 4 below.

Fig. 4 My learning process (P18, entry 1)

Study techniques were detailed in the data, including approaches to reading and varied personal approaches to note-taking. Participants included many visual examples of their highly individual note-taking approaches. This individual approach is evident in participant 21’s use of colour coding in her notes to aid revision; see Fig. 5.

Fig. 5 A page of the notes I take showing the colour scheme (P21, entry 1)

“I have once again used the approach to study that I discovered most beneficial to me last year. I read through the topic that I am trying to learn. I highlight the information that I feel is beneficial and are the key points. I then add these key points to a note pad. I use different coloured pens for titles of topic’s and the actual content so that when I look back over it, it can be easily found.” (P21, entry 1).

In addition, participants reported studying in a wide range of places. They studied at home, in the library, on their phone while commuting to work, in cafes, in work. Participants were creative at carving out time and space to study, as evidenced in the quote below:

“I am a busy mother of five so... it would be difficult for me to get to classes so I find I work my study around the children. I keep books in my car and study while I wait at football practice and I love the fact that I can look over an online class later in the evening.” (P14, entry 1).

Many included images of their study spaces; for example, participant 18 evidenced his study space in the image in Fig. 6 below, which he described as “the view from the window of my attic office” (P18, entry 5).

Fig. 6 The view from my attic office (P18, eportfolio entry 5)

To summarise the findings section above, the key factors in the engagement and success of this particular cohort of online students were: community, time management and organisational skills, engaging and supportive online teachers, multiple means of interaction, and opportunities to develop skills and build confidence.

Discussion

The purpose of this study was to explore themes relating to online student engagement experiences. This section discusses the findings of the study in relation to Kahu’s (2013) holistic conceptual framework of student engagement, the research question and the existing literature.

The findings from this study indicate that being a successful online student was impacted by the structural influence of lifeload. It was found that the most challenging aspect of being an online learner was balancing one’s studies with other highly valued and time-consuming commitments, such as work, family and caring responsibilities. This is consistent with previous research which found that trying to fulfil multiple roles and juggle professional, family, social and study contexts can cause online distance students to experience considerable stress (Brown et al., 2015 ; Kahu et al., 2014 ; Zembylas et al., 2008 ).

A further structural influence on online student engagement in this study was the design-led approach to the use of discussion forums in this sociology module. Asynchronous discussion forums were used as a place to ask questions, seek support and interact with the module tutor and classmates. Discussion forums were used in conjunction with synchronous online seminars for the purpose of interacting with others in the module. In this study, there were mixed reactions from participants about the usefulness of discussion forums. For some participants, discussion forums were useful for interactive discussion and viewing other students’ questions and concerns. Other participants felt nervous, exposed and were disinclined to post on the discussion forums. This finding differs from the literature, which indicates that active communication between students and instructors using asynchronous discussion forums promotes interaction and social presence (Buck, 2016; Gauvreau, Hurst, Cleveland-Innes, & Hawranik, 2016). This means that discussion forums, which were a key mechanism of support in the module, did not meet the support needs of some students.

A key psychosocial influence on online student engagement was found to come from interaction with the peer community, which engendered feelings of belonging and support. Participants placed a high value on the peer communities they formed over the course of the academic year. Three types of formal and informal peer communities were formed: the official institutional community; the student-generated and student-led class community; and smaller cohorts of student-generated study groups. These peer communities were perceived by participants to be an essential source of support, reassurance, encouragement and human connection. This finding is consistent with previous research on peer interaction in online courses carried out by Andrews and Tynan ( 2012 ), O’ Shea et al. ( 2015 ), Zembylas et al. ( 2008 ). Further, a proximal consequence of the strong peer community formed was an enhanced learning experience through discussion and debate in the module discussion forums. Despite the presence of a strong class community within the cohort, the findings from this study suggest that for a small number of participants, being an online student was an isolating experience which may have caused or contributed to disengagement with the module. This is in line with previous research which evidenced that feelings of isolation were often experienced by online distance students (Bolliger & Shepherd, 2010 ; Phirangee & Malec, 2017 ).

The module tutor played a very significant role in participants’ reported online learning engagement experiences. The data revealed that participants placed a high value on the support, reassurance and guidance provided by the module tutor. These findings reinforce the central role of the online teacher for online student success and engagement evidenced in the literature (Buck, 2016; Gauvreau et al., 2016; O’ Shea et al., 2015).

The data revealed that time management and organisational skills were key skills for online student success and engagement, which enabled students to balance their lifeload and study. Successful online students developed creative strategies to carve out study time in their already busy lives. This is evident in the unusual and personal study patterns and locations reported. However, this study suggests that for some participants, challenges with time management persisted, causing stress, anxiety and disengagement. These findings echo previous research which details the persistent challenge for online students to follow a regular study schedule (Blackmon & Major, 2012; Brown et al., 2015; Buck, 2016; Darmody & Fleming, 2009; Kahu et al., 2014).

Confidence or self-efficacy was also found to have a psychosocial influence on participants’ learning experiences. At the start of the module, some participants expressed feelings of self-doubt, fear, apprehension and uncertainty, and a lack of confidence in their own academic abilities and approaches to studying. For some participants, achievement in assignments had a positive impact on their confidence. Those who engaged in self-regulatory behaviour and proactively addressed their perceived academic weaknesses gained confidence in their academic abilities, a point supported by the literature on confidence and online students (Baxter, 2012; O’ Shea et al., 2015).

Strongly emphasised within participant data was the importance of the time and effort they invested in their learning, which is evident in their highly personal approaches to studying. These provide a detailed insight into the study approaches of online students: when, where and how they learned. These study techniques were varied and innovative, encompassing traditional approaches such as reading, annotating and note-taking, and modern approaches to study using online resources such as YouTube videos, online lectures, podcasts, glossaries, online articles and recordings of previous online tutorials. Although the approaches to learning and the study techniques of online learners in this study were highly individual, they still conformed to traditional approaches to study such as note-taking, reading, assignment preparation and writing. This is broadly in line with previous research carried out by Orton-Johnson (2008) and Çakıroğlu (2014), which found that the study habits of online students follow traditional study activities such as reading, note-taking and writing assignments and are similar to those of campus-based students.

In summary, what has emerged from this study as being most important to the engagement and success of this particular cohort of online students, is as follows:

Formal and informal community;

Time management and organisational skills;

Engaging and supportive online teachers;

Multiple means of interaction, not just forums;

Opportunities for skill development, confidence building and self-regulation.

Conclusions

The purpose of this study was to explore the themes that are central to online student engagement experiences in Irish higher education. The findings of this study indicate that successful online student engagement was influenced by a number of psychosocial factors, such as peer community, an engaging online teacher, and confidence or self-efficacy, and by structural factors such as lifeload and course design. Time management and organisational skills were key to online student success and engagement, enabling students to balance their lifeload and study; however, the study indicates persistent challenges for online students in following a regular study schedule. One limitation of this study is that it is a relatively small, in-depth qualitative case study. However, its findings provide insights into important themes relating to online student engagement, which may inform research and practice. For those supporting, teaching and designing online courses, being cognisant of the psychosocial and structural factors that affect online student engagement is valuable. Future research could further examine individual factors which influence online student engagement, such as time.

Availability of data and materials

The datasets generated and/or analysed during the current study are not publicly available due to ethical restrictions but are available from the corresponding author on reasonable request.

References

Andrews, T., & Tynan, B. (2012). Distance learners: Connected, mobile and resourceful individuals. Australasian Journal of Educational Technology, 28(4), 565–579.

Baxter, J. A. (2012). Who am I and what keeps me going? Profiling the distance learning student in higher education. The International Review of Research in Open and Distributed Learning , 13(4), 107–129. https://doi.org/10.19173/irrodl.v13i4.1283 .

Blackmon, S. J., & Major, C. (2012). Student experiences in online courses: A qualitative research synthesis. Quarterly Review of Distance Education , 13 (2), 77–85.

Bolliger, D. U., & Shepherd, C. E. (2010). Student perceptions of Eportfolio integration in online courses. Distance Education , 31 (3), 295–314. https://doi.org/10.1080/01587919.2010.513955 .

Bowles, T. V., & Brindle, K. A. (2017). Identifying facilitating factors and barriers to improving student retention rates in tertiary teaching courses: A systematic review. Higher Education Research & Development , 36 (5), 903–919. https://doi.org/10.1080/07294360.2016.1264927 .

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology , 3 (2), 77–101. https://doi.org/10.1191/1478088706qp063oa .

Brown, M., Hughes, H., Keppell, M., Hard, N., & Smith, L. (2015). Stories from students in their first semester of distance Learning. The International Review of Research in Open and Distance Learning , 16 (4), 1–17.

Brunton, J., Brown, M., Costello, E., & Farrell, O. (2018). Head start online: flexibility, transitions and student success. Educational Media International , 55 (4). https://doi.org/10.1080/09523987.2018.1548783 .

Buck, S. (2016). In their own voices: Study habits of distance education students. Journal of Library & Information Services in Distance Learning , 10 (3–4), 137–173. https://doi.org/10.1080/1533290X.2016.1206781 .

Çakıroğlu, Ü. (2014). Analysing the effect of learning styles and study habits of distance learners on learning performances: A case of an introductory programming course. The international review of research in Open and distributed Learning , 15 (4). https://doi.org/10.19173/irrodl.v15i4.1840 .

Castaño-Muñoz, J., Colucci, E., & Smidt, H. (2018). Free digital Learning for inclusion of migrants and refugees in Europe: A qualitative analysis of three types of Learning purposes. The International Review of Research in Open and Distance Learning , 19 (2), 1–21.

Darmody, M., & Fleming, B. (2009). ‘The balancing act’ – Irish part time undergraduate students in higher education. Irish Educational Studies , 28 (1), 67–83. https://doi.org/10.1080/03323310802597333 .

Delahunty, J., Verenikina, I., & Jones, P. (2014). Socio-emotional connections: Identity, belonging and learning in online interactions. A literature review. Technology, Pedagogy and Education , 23 (2), 243–265. https://doi.org/10.1080/1475939X.2013.813405 .

Delaney, L., & Brown, M. (2018). To walk invisible: Distance students in a dual mode university. Distance Education , 39 (2), 209–223. https://doi.org/10.1080/01587919.2018.1457948 .

Delaney, L., & Farren, M. (2016). No ‘self’ left behind? Part-time distance learning university graduates: Social class, graduate identity and employability. Open Learning: The Journal of Open, Distance and e-Learning , 31 (3), 194–208. https://doi.org/10.1080/02680513.2016.1208553 .

Delaney, L., Fox, S. (2013). The role of distance education in broadening access to Irish higher education. In: How equal? Access to higher education in Ireland, 7 Nov 2013, Dublin, Ireland. Retrieved 13th January, 2018, from http://doras.dcu.ie/19966/

Department of Education and Skills. (2011). National Strategy for Higher Education to 2030 Report of the Strategy Group. Dublin: Department of Education and Skills.

European Commission (2014). Report to the European commission on new models of learning and teaching in higher education . Luxembourg: Publications Office of the European Union Retrieved from http://ec.europa.eu/education/library/reports/modernisation-universities_en.pdf .

Fairchild, E. E. (2003). Multiple roles of adult learners. New Directions for Student Services , 102 , 11–16. https://doi.org/10.1002/ss.84 .

Farrell, O., & Seery, A., (2019). “I am not simply learning and regurgitating information, I am also learning about myself”: learning portfolio practice and online distance students. Distance Education , 40 (1). https://doi.org/10.1080/01587919.2018.1553565 .

Frey, J. (2015). The importance of learning experience design for higher education. Retrieved from http://www.gettingsmart.com/2015/04/the-importance-of-learning-experience-design-for-higher-education/

Gauvreau, S. A., Hurst, D., Cleveland-Innes, M., & Hawranik, P. (2016). Online Professional Skills Workshops: Perspectives from Distance Education Graduate Students. The International Review of Research in Open and Distributed Learning , 17(5). https://doi.org/10.19173/irrodl.v17i5.2024 .

HEA (2012). Part-time and flexible higher education in Ireland . Dublin: HEA Retrieved May 11, 2019, from https://www.dkit.ie/system/files/HEA%20Report%20on%20Lifelong%20Learning%202013.pdf .

Higher Education Authority. (2016). Developing Talent, Changing Lives. An Evaluation of Springboard+, 2011–16. Retrieved June 13, 2019, from https://hea.ie/resource-year/2019/publications/

Higher Education Authority. (2018). Key Facts & Figures: Higher Education 2017/18. Retrieved June 13, 2019, from https://hea.ie/resource-year/2019/publications/

Holder, B. (2007). An investigation of hope, academics, environment, and motivation as predictors of persistence in higher education online programs. The Internet and Higher Education , 10 (4), 245–260. https://doi.org/10.1016/j.iheduc.2007.08.002 .

Kahu, E. R. (2013). Framing student engagement in higher education. Studies in Higher Education , 38 (5), 758–773. https://doi.org/10.1080/03075079.2011.598505 .

Kahu, E. R., & Nelson, K. (2018). Student engagement in the educational interface: Understanding the mechanisms of student success. Higher Education Research & Development , 37 (1), 58–71. https://doi.org/10.1080/07294360.2017.1344197 .

Kahu, E. R., Picton, C., & Nelson, K. (2019). Pathways to engagement: A longitudinal study of the first-year student experience in the educational interface. Higher Education . https://doi.org/10.1007/s10734-019-00429-w .

Kahu, E. R., Stephens, C., Zepke, N., & Leach, L. (2014). Space and time to engage: Mature-aged distance students learn to fit study into their lives. International Journal of Lifelong Education , 33 (4), 523–540. https://doi.org/10.1080/02601370.2014.884177 .

Mallman, M., & Lee, H. (2016). Stigmatised learners: Mature-age students negotiating university culture. British Journal of Sociology of Education , 37 (5), 684–701. https://doi.org/10.1080/01425692.2014.973017 .

McGivney, V. (2004). Understanding persistence in adult learning. Open Learning , 19 (1), 33–46.

Nichols, M. (2011). Intervention for retention through distance education: A comparison study (project output). Aotearoa, New Zealand: National Centre for Tertiary Teaching Excellence.

O’ Shea, S., Stone, C., & Delahunty, J. (2015). “I ‘feel’ like I am at university even though I am online.” exploring how students narrate their engagement with higher education institutions in an online learning environment. Distance Education , 36 (1), 41. https://doi.org/10.1080/01587919.2015.1019970 .

Orton-Johnson, K. (2008). The online student: Lurking, chatting, flaming and joking. Sociological Research Online. , 12 (6). https://doi.org/10.5153/sro.1615 .

Phirangee, K., & Malec, A. (2017). Othering in online learning: An examination of social presence, identity, and sense of community. Distance Education , 38 (2), 160–172. https://doi.org/10.1080/01587919.2017.1322457 .

Quality matters (2019) Higher Ed. Standards. Accessed from https://www.qualitymatters.org/qa-resources/rubric-standards/higher-ed-rubric .

Roll, I., Russell, D. M., & Gašević, D. (2018). Learning at scale. International Journal of Artificial Intelligence in Education, 28(4), 471–477. https://doi.org/10.1007/s40593-018-0170-7.

Rose Sr., M. (2018). What are some key attributes of effective online teachers? Journal of Open, Flexible and Distance Learning , 22 (2), 32–48.

Sikes, P., & Potts, A. (2008). Researching education from the inside. London: Routledge.

Simpson, O. (2004). Supporting students for success in online and distance education . New York: Routledge. https://doi.org/10.4324/9780203416563 .

Stone, C., & O’Shea, S. (2019). Older, online and first: Recommendations for retention and success. Australasian Journal of Educational Technology , 35 (1), 57–69. https://doi.org/10.14742/ajet.3913 .

Veletsianos, G., & Navarrete, C. C. (2012). Online social networks as formal Learning environments: Learner experiences and activities. The International Review of Research in Open and Distributed Learning , 13 (1), 144–166.

Woodley, A., & Simpson, O. (2014). Student dropout: The elephant in the room. In Online distance education: Towards a research agenda (pp. 459–484). Canada: Athabasca University.

Yang, D., Baldwin, S., & Snelson, C. (2017). Persistence factors revealed: Students' reflections on completing a fully online program. Distance Education , 38 (1), 23. https://doi.org/10.1080/01587919.2017.1299561 .

Yoo, S. J., & Huang, W. D. (2013). Engaging online adult learners in higher education: Motivational factors impacted by gender, age, and prior experiences. The Journal of Continuing Higher Education, 61(3), 151–164. https://doi.org/10.1080/07377363.2013.836823.

Young, C., & Perović, N. (2018). ABC Learning Design.

Zembylas, M., Theodorou, M., & Pavlakis, A. (2008). The role of emotions in the experiences of online Learning: Challenges and opportunities. Educational Media International , 45 (2), 107–117. https://doi.org/10.1080/09523980802107237 .

Acknowledgements

Not applicable.

Funding

No funding was received.

Author information

Authors and Affiliations

National Institute for Digital Learning, Dublin City University, Bea Orpen Building, Glasnevin Campus, Dublin 9, Ireland

Orna Farrell & James Brunton

Contributions

OF collected and analysed the data and drafted the first draft article. JB contributed to the literature review, methodology and findings sections and substantially edited and revised the first draft article. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Orna Farrell .

Ethics declarations

Competing interests.

The authors declare that they have no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Cite this article.

Farrell, O., Brunton, J. A balancing act: a window into online student engagement experiences. Int J Educ Technol High Educ 17 , 25 (2020). https://doi.org/10.1186/s41239-020-00199-x

Received : 27 August 2019

Accepted : 31 March 2020

Published : 29 April 2020

DOI : https://doi.org/10.1186/s41239-020-00199-x


Keywords

  • Online student
  • Student success
  • Online learning
  • Qualitative
  • Student voice

Undergraduate student engagement at a Chinese university: a case study

  • Published: 24 February 2015
  • Volume 27 , pages 105–127, ( 2015 )

Cite this article

case study about student engagement

  • Zhe Zhang 1 , 2 ,
  • Wenhua Hu 3 &
  • Olwen McNamara 1  

2143 Accesses

23 Citations

2 Altmetric

Explore all metrics

Student engagement in higher education has attracted worldwide attention in recent years because of its strong correlation with positive outcomes of student learning and also, increasingly, because of its influence on a consumer-oriented global education market. Such issues come into sharp focus in the case of China, currently the largest international market for higher education in Western Europe, Australia, and North America. As a growing number of Chinese universities become global players, they are competing with their Western European, Australian, and North American counterparts for a burgeoning home market of more discerning and cost conscious consumers. Nevertheless, limited attention has been paid by the higher education sector as to how student engagement is conceptualized by those consumers and what factors inform and influence their perceptions and choices. This case study of undergraduate student engagement at a Chinese university attempts to begin to answer these questions. It analyzed data collected through interviews and focus groups to investigate student and staff conceptualizations of student engagement and the factors that influenced it. The factors were categorized into external factors (contextual and institutional) and internal factors (personal). A sociocultural analysis identified three issues: transition , lack of student – staff interaction , and shock students .





Author information

Authors and affiliations.

Manchester Institute of Education, The University of Manchester, Ellen Wilkinson Building, Oxford Road, Manchester, M13 9PL, UK

Zhe Zhang & Olwen McNamara

School of Foreign Languages and Literature, Shandong University, 27 Shanda Nanlu, 250100, Jinan, China

School of International Education, Shandong University of Finance and Economics, 7366 Erhuan Donglu, 250014, Jinan, China


Corresponding author

Correspondence to Zhe Zhang .


About this article

Zhang, Z., Hu, W. & McNamara, O. Undergraduate student engagement at a Chinese university: a case study. Educ Asse Eval Acc 27 , 105–127 (2015). https://doi.org/10.1007/s11092-015-9213-x


Received : 01 October 2013

Accepted : 14 January 2015

Published : 24 February 2015

Issue Date : May 2015

DOI : https://doi.org/10.1007/s11092-015-9213-x


  • Chinese student engagement
  • Higher education
  • Student–staff interaction
  • Shock students
  • Influencing factors


NStEP Case Study Hub

Welcome to the new NStEP Case Study Hub, where you will find examples of good practice in student engagement and student-staff partnerships from across Irish higher education. At NStEP we believe the best way to enhance student engagement in higher education is to support students and staff to share with and learn from one another.

Student-Staff Partnerships in Action


#RCSIPulseCheck

This student-led project brought together students, academics, and the wider community to promote childhood physical health.

Authors: Andrew Cummiskey, Maria Kelly, and Grace O’Malley

RCSI University of Medicine and Health Sciences

The Impact of Commuting on the Student Experience

This student-staff research project aimed to improve the experience of students who commute to college.

Authors: Margaret Keogh, Beth McKeague, and Denis Casey

Maynooth University

Student Teacher Educational Research (STER)

This project supports students to develop research skills while leading the co-publication of an academic journal, conference and podcast.

Authors: Aimie Brennan

Marino Institute of Education

Diversifying the Curriculum: Building MultiStories

A collaborative project between academics, librarians, and students led to a wider institutional impact on inclusive knowledge creation.

Authors: Fionnuala Darby and Lindsay Dowling

Technological University Dublin

TIPToP Conference

This student-staff partnership supports transferable skills development through the co-creation of a biomedical science conference.

Authors: Lesley Cotter and Bridget Lucey

Munster Technological University

Student Engagement in Teaching and Learning Enhancement

Novel hyflex technologies to enhance classroom student engagement.

Working in partnership with students, this enhancement project led to co-created engagement methods in the classroom.

Authors: Sinéad Hurley, Thomas Buckley, Delfina Mancebo Guinea Arquez, and Debbi Stanistreet

Afri Hedge School: Annual Human Rights Hedge School

Over a number of years, this student-staff collaborative Hedge School has deepened learning in human rights, challenging racism, and sustainable development.

Authors: Liam McGlynn and Garreth Smith

PoCUS Ultrasound Education in the Undergraduate Curriculum

In this case study, staff collaborated with 5th year radiology students to create student-led ultrasound workshops for 3rd years.

Authors: Claire Condron, Miroslav Voborsky, and John Karp

Professional Development and Graduate Attributes within Pharmacy Technician Studies

This project engaged students in the development of graduate attributes, leading to collaborative charitable and skills development initiatives.

Authors: Tao Zhang, Julie Dunne, Kathy Young, Gemma Kinsella, and Seána Hogan

Teach Digi Podcast: "Ag Caint"

This student-staff partnership aimed to support staff professional development and student engagement in digital teaching and learning.

Authors: Clíodhna O’Callaghan and Stephen O’Riordan

University College Cork

Student Partnership Approaches to Assessment

Student Partnership in Assessment (SaPiA)

This case study explores the wider concept of SaPiA and outlines how this informed student-staff assessment projects in DCU.

Author: Rob Lowney

Dublin City University

BALI - Building Assessment Literacy Initiatives

This student-staff partnership co-produced resources and supports to enhance assessment and feedback practices.

Authors: Cliona Hatano, William Carey, and Sinéad Huskisson

Student Partnership in Assessment Design

This reflective case study from a teacher and a student demonstrates the power of partnership in assessment.

Authors: Margaret Finch and Saffron Williams

Student Assessors for Pharmacy Technician Practical Assessments

This case study promotes the value of student-led assessment to deepen learning and enhance practical work-based experience.

Authors: Diane Patterson and Niamh Dillon

Technological University of the Shannon

Addressing Racism Through Advocacy in Action

In this student-staff co-created continuous assessment project, the outcomes for learning and societal change stemmed from inclusivity and participation.

Author: Georgina Lawlor

Institutional Approaches to Assessment and Feedback Literacy

This case study demonstrates that assessment and feedback can provide a wider opportunity to enhance student partnership practices across the institution.

Authors: Margaret Finch and Aileen Kennedy

Developing Student-Led Peer Support

Peer Learning and Student Partnership

The CCT Peer Mentoring Academy has evolved and deepened student partnership approaches that support the wider student body.

Authors: Aldana Louzan Aldani and Marie O’Neill

CCT College Dublin

CÉIM Shared Learning

This students’ union-led initiative provides a peer-assisted learning model to over 2000 students annually.

Authors: Amber Walsh Olesen and Niamh Tobin

University of Galway

P2P Peer to Peer Support - Students Helping Students

Peer to Peer at SETU Carlow has developed partnership in action through student-led production of peer publications and resources.

Authors: Mary Boylan, Helena Fitzgerald, Yvonne Kavanagh, David Denieffe, and Alannah Somers

South East Technological University

If you wish to contact any of the authors or submit a query related to the Case Study Hub, please email [email protected]  


Teacher behaviour and student engagement with L2 writing feedback: a case study

Feedback is essential for student learning and engagement is key for its efficacy. Yet research on student engagement with feedback predominantly attributes it to learner factors, overlooking teacher influence. This case study explored how one writing teacher’s behaviours shaped a motivated undergraduate’s engagement with various types of feedback in a writing course over one semester. Data sources included interviews, class observations, and text analysis. Findings revealed the pivotal role of teacher feedback behaviours in shaping student engagement, often through complex interactions with learner factors and teacher non-feedback behaviours. While some feedback behaviours enhanced student engagement, most had negligible or detrimental effects, highlighting the contextual nature of “best practices”. Certain teacher behaviours also exerted lasting impacts on student engagement. Additionally, some teacher non-feedback behaviours, both teaching and non-teaching, also contributed to shaping student engagement. These findings have implications for both research and teacher education.

Funding source: Double First Class University Plan Fund of Lanzhou University

Award Identifier / Grant number: 561119204

Appendix A: Student engagement coding scheme

Appendix B: Teacher feedback behaviour categorization (FL = feedback literate, FI = feedback illiterate)


© 2023 Walter de Gruyter GmbH, Berlin/Boston


International Review of Applied Linguistics in Language Teaching





Student Engagement Case Studies

These case studies provide examples of student engagement initiatives in UK institutions. They include interventions both inside and outside the classroom and represent a range of approaches to "engagement" and to fostering that engagement. They have been compiled by Department of Educational Research University of Lancaster for the HEA Student Engagement Project

student_engagement_case_studies.pdf

The materials published on this page were originally created by the Higher Education Academy.


  • Open access
  • Published: 14 May 2024

The community of inquiry as a tool for measuring student engagement in blended massive open online courses (MOOCs): a case study of university students in a developing country

  • John Kwame Eduafo Edumadze   ORCID: orcid.org/0000-0003-2422-4909 1 , 2 &
  • Desmond Welsey Govender 2  

Smart Learning Environments volume  11 , Article number:  19 ( 2024 ) Cite this article

138 Accesses

Metrics details

While massive open online courses (MOOCs) promise to democratise access to education, the literature reveals a nuanced understanding of engagement in these settings, especially in resource-constrained environments. Blended MOOCs combine MOOC content and instruction with physical classroom settings. This study extends this discourse by focusing on blended MOOCs, which remain under-explored in the context of developing countries. The blended MOOC model at the University of Cape Coast (UCC), Ghana, integrates third-party MOOCs, used as open educational resources (OERs), with campus-based courses. UCC students have been using such blended MOOCs since 2016, when all level 100 students were mandated to enrol in a course entitled Information Technology Skills (ITS101). ITS101 is aligned to courses on the Alison MOOC platform, which serve as OERs. Students' engagement is key to their continued use of and satisfaction with online learning such as MOOCs. However, among all the e-learning modes, students' engagement is lowest in MOOCs, leading to high dropout rates. Blended MOOCs are one of the techniques recommended to address the shortcomings of MOOCs, including low engagement. However, few studies have examined students' engagement in blended MOOCs, especially among university students in sub-Saharan Africa using MOOCs as OERs. Thus, this paper aims to measure student engagement in blended MOOCs using the revised Community of Inquiry framework among university students in a developing country. The rationale is to determine which factors affect engagement positively or negatively. A two-stage cluster sampling technique was used to select the participants for this study. A list of blended MOOC classes offered at UCC was obtained from the staff mailing list. In the first stage, academic levels (100, 200, 300 and 800) were randomly selected from the strata using a lottery sampling technique. In the second stage, a simple random selection of blended MOOC courses or classes was made within each selected academic level. All students in the selected classes were then included in the study. Partial Least Squares Structural Equation Modelling was used to validate the model of predictive relationships among the four presences (cognitive, learning, social and teaching) and engagement. Results from the structural model analysis showed statistically significant predictive relationships among the constructs within the model. Learning presence had the most significant effect on student engagement. Thus, it should be included as one of the presences in the Community of Inquiry.
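The two-stage cluster sampling described above can be illustrated with a brief sketch. This is a minimal illustration only, not the authors' sampling script; the class codes and cluster sizes are hypothetical assumptions.

```python
import random

# Hypothetical sampling frame: blended-MOOC classes grouped by academic level.
# Level labels (100, 200, 300, 800) follow the paper; class codes are invented.
frame = {
    100: ["ITS101-A", "ITS101-B", "ITS101-C"],
    200: ["MOOC-ENG203", "MOOC-BIO210"],
    300: ["MOOC-CSC305", "MOOC-HIS310", "MOOC-MAT301"],
    800: ["MOOC-EDU801"],
}

random.seed(42)  # fixed seed so the illustration is reproducible

# Stage 1: a "lottery" draw of academic levels from the strata.
selected_levels = random.sample(sorted(frame), k=2)

# Stage 2: a simple random selection of classes within each selected level;
# every student in a selected class is then invited to participate.
selected_classes = {
    level: random.sample(frame[level], k=min(2, len(frame[level])))
    for level in selected_levels
}

print("Selected levels: ", selected_levels)
print("Selected classes:", selected_classes)
```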

Introduction

Envision an educational setting wherein students are actively engaged in their learning process, fostering collaboration among peers and engaging with their instructors locally and, across various timeframes, globally on a massive scale. This scenario is the promise of blending massive open online courses (MOOCs) with campus-based courses, which offers the potential to integrate the advantages of MOOCs, such as affordability, flexibility and accessibility, with the interactive and social components inherent in traditional campus-based face-to-face (F2F) instruction (Almutairi & White, 2018; Edumadze et al., 2022). With MOOCs, affordability refers to the fee component, which is typically free or very low-cost for certificates or specific course materials; flexibility refers to self-paced study, allowing learners to progress at their own speed within enrolment windows or deadlines; and accessibility refers to accessing MOOC materials from any location with a stable internet connection, using a wide range of devices (phones, tablets, computers). This form of blended learning can have a transformative impact on the educational landscape of developing nations, especially at the tertiary level (Kruse & Pongratz, 2017), though it is also applicable at the secondary level (Koutsakas et al., 2020). Thus, MOOCs' scalability and adaptability across different educational levels are emphasised. The capacity of MOOCs to simultaneously give the same experience to tens of thousands of students breaks the pattern of conventional university education and has the potential to expand access to education and decrease educational costs (Groves, 2012). Increasing access has been the goal of all universities (Vieira et al., 2020), and any EdTech solution that "rides on the wings" of the internet can attain global reach (UNICEF Office of Innovation, 2022). Again, given the increasing cost of "bricks and mortar" education and the 'swallowing effect' of student debt, any mode of education that combines high-quality instructional delivery at minimal cost has the potential to be embraced by education seekers. That is why MOOCs should occupy the attention of all learners, educators, teachers, administrators, and policy-makers. McNutt (2013) quotes journalist Fareed Zakaria's conversation with the prime minister of a large developing nation. This prime minister said that delivering wireless internet to every region of his country would make higher education accessible: he would then tell students to attend free online courses from American colleges, like MOOCs, so that more people could acquire higher education. Though the prime minister oversimplified the solution, using MOOCs to increase access to tertiary education is an option that all must explore. Introducing teachers and students to the continuous use of innovative technology such as MOOCs is a way of leveraging technology in education, especially in cash-strapped institutions such as those in the global South, including Ghana. As a platform for online dissemination of academic content, MOOCs afford students unprecedented access to educational materials that are high-quality and engaging.

A MOOC is a method for distributing educational content via the Internet to anyone who desires to participate, with no restriction on the number of participants (Educause Learning Initiative, 2011). MOOCs are classes intended for many students (usually in the hundreds of thousands) and made available anytime, anywhere, using any internet-driven device. They provide a free online educational experience, regardless of participants' prior entry qualifications, training or education (OpenupEd, 2015). MOOCs open a world of educational possibilities for learners and lifelong learners alike, particularly in developing countries. Because of MOOCs' unlimited open access, a new breed of open educational resources (OERs) has emerged (Qaffas et al., 2020). Higher education institutions (HEIs) are embracing this new breed in their on-campus courses (Kloos et al., 2015; Zhang, 2013). This embrace has resulted in blended or hybrid learning, that is, any learning activity that integrates MOOC content or learning objects into a traditional campus-based curriculum. This educational technology integration strategy has opened avenues for HEIs, especially those from developing nations with no online learning components, to experience online learning through such blends. Through this means, students can access free but high-quality academic materials in the form of these OERs or open textbooks, which were previously very expensive or difficult to acquire. Through this integration, MOOCs represent an excellent opportunity to revolutionise blended learning across all levels of education, including organisations providing continuous professional development (Ulrich & Nedelcu, 2015). MOOCs also help in the classroom when licensed as the next generation of textbooks, becoming one of the tools a teacher uses to teach the course (TED, 2014, 13:09). The name for the combination of a MOOC and an on-campus course is currently contested: terms such as blended MOOC (bMOOC/B-MOOC), hybrid MOOC (H-MOOC), wrapped MOOC and distributed flip, among others, have emerged (Almutairi & White, 2018; Bruff et al., 2013; Koller et al., 2013; Yousef et al., 2015).

Blended MOOCs come in several models, as explained by Edumadze et al. ( 2022 ), which include university-created MOOCs utilised for credit transfer and integrating other MOOCs within the curriculum. The latter can entail establishing official collaborations with MOOC providers or informal adoption where faculty and students engage autonomously. At UCC, the method follows the second informal model where external MOOCs are included in classroom instruction, either partially or wholly, to supplement or improve the standard course content. In this situation, MOOCs function as supplementary open educational resources (OERs), offering versatility in instructional approaches such as flipped classes and reference materials. This integration showcases a dynamic utilisation of MOOCs to enhance the teaching environment.

Walters ( 2014 ) suggested the following possibilities exist because of the blended MOOCs:

Unbundling content by moving away from a single source (Professor) to multiple sources (online, students, professors (when co-lecturing) and experts from the industry),

Making it possible for students to take some courses locally at their enrolled university and others as MOOCs with content from different providers (institutions and universities),

Changing classroom spaces from large lecture halls to small learning spaces.

Universities and tutors adopt blended MOOCs for various reasons, among which are:

Making students aware of the MOOC phenomenon and trends (Holotescu et al., 2014 )

Enlarging the knowledge/topics of the course in which they have enrolled locally (Holotescu et al., 2014)

Creating awareness of MOOCs among lecturers.

Serving to introduce e-learning to stakeholders of HEIs that do not officially use such technology.

Introducing lifelong learning practice among students.

Serving as a way of participating in open education (i.e., open access, OER) among stakeholders.

Exposing students to teaching materials and pedagogies from other HEIs in different countries.

The benefits of blended MOOCs include optimising student engagement, satisfaction, and learning (Bruff et al., 2013).

Notwithstanding the benefits of MOOCs, studies such as Onah et al. ( 2022 ) and Yusof et al. ( 2017 ) have identified learners' lack of engagement as a significant drawback leading to high dropout rates. Monitoring how much time students spend on learning activities promotes high-quality learning and reduces course dropouts (Hussain et al., 2018 ; Rahimi, 2024 ). Learners' engagement has been seen as the reason for creating blended MOOCs and a key to making MOOCs work. Contact North ( 2016 ) declared that educators, educationists, instructional designers, instructional technologists, and others with a stake in education should be concerned about how engaged students are in online activities. Furthermore, Contact North ( 2016 ) also made the following points:

Not all courses have high levels of student engagement, and some give students few chances to build their learning (Contact North, 2016 , p. 2).

Student engagement is the best way to predict how well students learn (Contact North, 2016 , p. 7).

Institutions should pay more attention to their understanding of instructional design, student engagement, and assessment (Contact North, 2016 , p. 7).

If we want more students to succeed, we should emphasise student engagement and learning design, and faculty should be very involved in this work (Contact North, 2016 , p. 7).
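As a concrete illustration of the time-on-task monitoring mentioned above (Hussain et al., 2018; Rahimi, 2024), the sketch below aggregates activity-log events into per-student study time. The event structure and field values are assumptions made for illustration; they do not describe any particular platform's logs.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical clickstream: (student_id, activity, start, end) in ISO 8601 format.
events = [
    ("s01", "video", "2024-03-01T09:00:00", "2024-03-01T09:25:00"),
    ("s01", "quiz",  "2024-03-01T09:30:00", "2024-03-01T09:50:00"),
    ("s02", "forum", "2024-03-01T20:10:00", "2024-03-01T20:18:00"),
]

time_on_task = defaultdict(float)  # minutes of logged activity per student
for student, _activity, start, end in events:
    duration = datetime.fromisoformat(end) - datetime.fromisoformat(start)
    time_on_task[student] += duration.total_seconds() / 60

for student, minutes in sorted(time_on_task.items()):
    print(f"{student}: {minutes:.0f} minutes of logged learning activity")
```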

Other researchers have come to a similar conclusion on student engagement, among which are:

Student engagement is one of the best learning and personal growth predictors (Sukor et al., 2021 , p. 640).

Student engagement, learner interaction, and teacher presence explained many differences in student satisfaction and how much they thought they were learning in online learning environments (Gray & DiLoreto, 2016 , p. 1).

Various factors inside and outside the classroom can affect students' engagement (Vezne et al., 2023, p. 1866).

Students actively engaged in learning tend to do better in school (Deci & Ryan, 2000 , as cited in Vezne et al., 2023 , p. 1866). This is because they enjoy and see the value in their actions.

Student engagement strongly predicts educational results (Wang & Degol, 2014, p. 3).

Student engagement predicts learning and school performance (Haw et al., 2022 , p. 226).

Student engagement represents a significant challenge in online learning, especially in blended MOOC environments. This challenge is not limited to specific geographies; it is a global concern impacting students, teachers, and academic institutions. Contact North (2016) underscores that educational institutions must critically reassess instructional design, student engagement, and assessment strategies to foster meaningful learning experiences. Garrison and Vaughan (2008) further highlight the importance of reevaluating course design in blended learning settings to optimise student engagement. These points set the stage for understanding the universal relevance of student engagement in educational success.

However, the literature reveals a notable gap in the research concerning the impact of blended MOOCs on student engagement, particularly within the context of Ghana in Sub-Saharan Africa. Despite the proliferation of MOOCs globally, there is a marked scarcity of research on their effectiveness in developing countries, with these studies pointing to this gap (Almutairi & White, 2018 ; Maphosa & Maphosa, 2023 ; Mutisya & Thiong’o, 2021; Montgomery et al., 2015 ; Yunusa et al., 2021 ; Zakaria et al., 2019 ). Sub-Saharan Africa's unique challenges, such as limited internet access and diverse educational needs, present a compelling case for a tailored approach to educational technology (Maphosa & Maphosa, 2023 ). This study aims to bridge this gap by exploring how blended MOOCs can enhance student engagement and learning outcomes in such contexts. The specific focus on Ghana allows for a detailed examination of these dynamics, offering insights that are both locally relevant and globally applicable (Yunusa et al., 2021 ).

Building on the theoretical framework of the revised Community of Inquiry (RCoI), this research investigates the collective impact of its presences on student engagement in blended MOOCs. Empirically, this study addresses a critical gap by employing a robust methodological approach to explore the relationship between these presences and student engagement. It validated a standardised scale for measuring student engagement in blended MOOCs by combining instruments from Almutairi and White (2018) and Wertz (2022), addressing the absence of such a standardised instrument for MOOCs identified by Deng et al. (2020). A study of this nature enhances the reliability and validity of the instrument and supports its standardisation.

Thus, this study aims to explore the impact of the revised Community of Inquiry (RCoI) framework elements—cognitive, learning, social and teaching presence—on student engagement in blended MOOCs at the University of Cape Coast, Ghana. Grounded in the RCoI framework, this research investigates the synergistic effect of its four presences on enhancing student engagement. The RCoI framework provides a robust theoretical lens through which the dynamics of engagement in blended learning environments can be examined (Garrison et al., 2000 ; Shea & Bidjerano, 2010 ). In achieving the stated aim, the following objectives will be pursued:

Analyse the Community of Inquiry (CoI) framework to identify indicators of student engagement in blended MOOCs.

Investigate the impact of the four CoI presences (teaching presence, cognitive presence, social presence, and learning presence) on student engagement in blended MOOCs.

Examine the specific contribution of learning presence within the CoI framework to student engagement in blended MOOCs.

In pursuance of the second objective, the following hypotheses will be investigated:

H1: No significant relationship exists between cognitive presence and students' engagement with blended MOOCs.

H2: No significant relationship exists between learning presence and students' engagement with blended MOOCs.

H3: No significant relationship exists between social presence and students' engagement with blended MOOCs.

H4: No significant relationship exists between teaching presence and students' engagement with blended MOOCs.
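To make the hypothesis tests concrete, the sketch below regresses a synthetic engagement composite on the four presence composites and bootstraps the coefficients, which is the basic idea behind the significance tests that PLS-SEM applies to structural paths. The data are simulated and the code is not the authors' analysis pipeline; it only illustrates the logic of H1-H4.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300  # hypothetical number of respondents

# Synthetic composite scores (Likert-style, 1-5):
# columns = cognitive, learning, social, teaching presence.
X = rng.uniform(1, 5, size=(n, 4))
# Synthetic engagement with a deliberately strong learning-presence effect.
y = (0.15 * X[:, 0] + 0.45 * X[:, 1] + 0.10 * X[:, 2]
     + 0.20 * X[:, 3] + rng.normal(0, 0.5, n))

def paths(X, y):
    """Least-squares path coefficients of engagement on the four presences."""
    A = np.column_stack([np.ones(len(X)), X])   # add intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta[1:]                             # drop the intercept

est = paths(X, y)

# Bootstrap the coefficients (resample respondents with replacement),
# mirroring how PLS-SEM judges whether each path differs from zero.
boot = np.array([paths(X[idx], y[idx])
                 for idx in (rng.integers(0, n, n) for _ in range(1000))])

for j, name in enumerate(["cognitive", "learning", "social", "teaching"]):
    lo, hi = np.percentile(boot[:, j], [2.5, 97.5])
    print(f"{name:9s} -> engagement: {est[j]: .2f}  95% CI [{lo: .2f}, {hi: .2f}]")
```

A path whose bootstrap confidence interval excludes zero would correspond to rejecting the matching null hypothesis above.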

Figure  1 shows the research framework for the study.

Figure 1: The research framework

This research addresses a gap in the geographical coverage of research on blended MOOCs in Sub-Saharan Africa, notably Ghana, making a significant addition to the educational technology literature. Examining student participation in this underrepresented region illuminates its unique educational problems and potential (Almutairi & White, 2018). The study's regional focus is also relevant since Maphosa and Maphosa (2023) emphasise the necessity for MOOC initiatives in Sub-Saharan Africa. The study explores student engagement in blended MOOCs using the RCoI paradigm, enriching the theoretical landscape, and it confirms the theoretical validity of this approach, enhancing our understanding of how the different presences affect student engagement synergistically. This research also validated a standardised scale to measure student engagement in blended MOOCs, filling a gap identified by Deng et al. (2020); a rigorous engagement assessment tool strengthens empirical research on blended learning by enabling more precise and replicable results across studies.
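Instrument validation of this kind usually starts with checking the internal consistency of each presence subscale before the structural model is estimated. The snippet below computes Cronbach's alpha for one subscale; the responses are synthetic and the 0.70 threshold is the conventional rule of thumb, not a value reported by this study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of Likert responses."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

rng = np.random.default_rng(1)
# Synthetic 5-item "learning presence" subscale for 200 respondents: a shared
# latent trait plus item noise, so the items correlate as a real subscale would.
trait = rng.normal(3.5, 0.8, size=(200, 1))
subscale = np.clip(np.rint(trait + rng.normal(0, 0.6, size=(200, 5))), 1, 5)

alpha = cronbach_alpha(subscale)
print(f"Cronbach's alpha = {alpha:.2f} (values of 0.70 or above are conventionally acceptable)")
```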

This research has significant implications for policymaking and instructional design, providing educators and designers with actionable insights to boost student engagement in blended learning. The study helps create more engaging blended learning experiences by identifying engagement tactics. Additionally, educational policies that promote equity and access are crucial, especially in regions facing infrastructural and resource limitations (King, 2015 ; Yunusa et al., 2021 ). Finally, this study adds to a more inclusive global educational technology debate by focusing on an underrepresented region with distinct educational needs and concerns. It ensures that Sub-Saharan African students' opinions are heard in educational technology transformation talks. This inclusive approach shows the study's dedication to a more equal and accessible learning landscape worldwide, promising to inform future educational policies and build a more inclusive educational environment for all.

The community of inquiry

Garrison et al. (1999) initially proposed the Community of Inquiry (CoI) framework as a social constructivist, collaborative paradigm. The framework was built around three ideas called "presences": cognitive presence (CP), social presence (SP) and teaching presence (TP). A "presence" describes a dimension of how participants interact and engage with one another while taking an online course, and each element of the framework is enacted through conversation conducted online. The three presences together shape learners' inquiry-based learning experiences within a learning community. Vaughan and Garrison (2008) argued that CoI is founded on two notions fundamental to higher education in an era powered by internet technologies, web 2.0 and social media: 'community' and 'inquiry'. Community recognises the social dimension of education, emphasising the need for interaction, collaboration, and conversation in knowledge construction (Arbaugh, 2007; Ranjan, 2020). Inquiry reflects the process of generating meaning through personal responsibility and choice (Arbaugh, 2007); similarly, Ranjan (2020) defined inquiry as how students develop meaning through their own initiative and selection. Thus, CoI is a theoretical description of what makes an online learning environment engaging and cognitively developing, and it offers a robust and comprehensive approach to teaching, learning, and assessment.

Many authors recommend expanding the CoI framework beyond its three presences to encompass additional presences: autonomy, distributed teaching, emotional, instructor, engagement, vicarious and learning presence (Kozan & Caskurlu, 2018). These additions are argued to strengthen the CoI framework (Kozan & Caskurlu, 2018) and better describe the online educational experience (Anderson, 2017; Moore & Miller, 2022). The learning presence is the only proposed addition that has been extensively embraced. Garrison and Anderson, the CoI framework's creators, hold differing opinions about the learning presence: Anderson supports its inclusion, while Garrison rejects it. According to Anderson (2017), the learning presence introduces self-directed learning and transforms the CoI's "teaching model" into a "teaching and learning paradigm," expanding its use beyond traditional schools and educational settings. Though Anderson (2017) acknowledged that the three presences provide a parsimonious advantage, he argues that including learning presence in the CoI framework offers the following benefits:

It addresses the limitation that the current CoI is helpful only for building and defining an efficient teaching framework.

It brings CoI closer to the principles of autonomous learning advocated by constructivist learning and heutagogical approaches to education.

It expands the scope of the CoI from a purely pedagogical one to one that incorporates learning, making it applicable in settings beyond the classroom and capturing the learning potential that students themselves demonstrate.

Garrison ( 2017 ) argues against including learning presence in the CoI framework:

It contradicts the concept of a collaborative inquiry community. CoI members are expected to exhibit varying degrees of all three presences, so teaching and learning responsibilities are shared rather than separated; a distinct learning presence instead assigns individuals separate responsibilities as teachers and students (Garrison, 2022).

Teaching and social presence were most related to learning presence, and cognitive presence emerged at the junction of these presences.

Adding more presences may increase framework complexity and violate parsimony principles.

The suggested presences can be incorporated into the model by expanding the scope of the three presences' definitions or the interrelationships between and among the three presences (Kozan & Caskurlu, 2018 ).

Shea (2010) suggested adding a "learning presence" to the CoI architecture. The suggestion came after observing that learner discourse, what students actually did, did not fit the CoI model as it then stood; Shea could not correctly code the identified discourse as social, cognitive, or teaching presence (Shea, 2010). Since online learning is electronic, social, and "self-directed," it was vital to study how students self- and co-regulate in these environments (Shea & Bidjerano, 2010). To do this, Shea and Bidjerano (2010) investigated various factors, such as the metacognitive abilities, motivational states and behavioural management strategies employed by successful online students. The lack of a learning presence that deals with the online learner role was observed as one of the limits of the CoI model (Shea & Bidjerano, 2010). The four presences of the RCoI framework are discussed in the subsequent sections.

Cognitive presence

Cognitive presence refers to the degree to which MOOC participants actively take part in substantive discussions and engage in critical thinking. Its defining features entail learners exchanging ideas and perspectives, expanding upon one another's contributions, and critically examining their own and others' thought processes. With cognitive presence, blended MOOC participants are expected to participate in substantive discussions and engage actively in critical thinking. Cognitive presence significantly promotes learners' active participation and directly impacts their academic performance and overall satisfaction (Kang et al., 2007a, 2007b).

Social presence

Social presence refers to the degree to which learners experience a sense of connection and belongingness with their peers and the instructor within blended MOOCs. The phenomenon is also marked by perceiving support from their peers and perceiving their ability to contribute to the learning process actively. It is widely recognised as crucial in establishing a conducive learning environment that fosters support and collaboration. A positive correlation exists between social presence and students' level of engagement in their academic pursuits and within the educational institution (Alabbasi, 2022 ). Social interactions among students contribute positively to their social integration and attitude towards the subject and foster a competitive learning environment (Damm, 2016 ). Finally, Miao and Ma ( 2022 ) observed that social presence directly impacts engagement and mediates the association between self-regulation and engagement.

Teaching presence

Teaching presence refers to the degree to which the instructor establishes a nurturing and interactive educational setting within a MOOC. Teaching presence encompasses the deliberate planning, facilitation, and guidance of cognitive and social activities to achieve meaningful and valuable learning outcomes. Empirical research has provided support for the positive impact of teaching presence on various learning outcomes, such as perceived learning, learner satisfaction, and behavioural engagement (Caskurlu et al., 2020). Effective teaching presence plays a crucial role in fostering student engagement and influencing the outcomes of their learning experiences (Zhang et al., 2016).

Learning presence

The blended MOOC on which this research is based retains the traditional roles of teacher and student. However, it subscribes to the student-centredness of MOOCs, hence the inclusion of the learning presence in this study. Learning presence refers to the degree to which learners actively acquire knowledge and skills and assume accountability for their learning. It is distinguished by learners' active involvement in establishing objectives, monitoring their progress, and seeking out educational resources to facilitate their learning. In short, learning presence concerns active participation in the educational process and the attainment of the intended educational objectives.

At first, "learning presence" represented academic self-efficacy and other cognitive, behavioural and motivational factors contributing to online students' capacity to self-regulate their learning (Shea & Bidjerano, 2010). Researchers have found that self-efficacy and self-regulated learning are strongly linked to other important factors for learning success, and Shea and Bidjerano (2010) regarded self-efficacy and self-regulation as important parts of the learning presence. Self-efficacy and self-regulatory skills both affect e-learning success, and they also affect each other within a learning environment; Doo and Bonk (2020) found that self-efficacy did not affect learning independently but supported self-regulated learning. Later, "learning presence" came to denote self-regulated learning within a community of inquiry (Jimoyiannis & Tsiotakis, 2017; Shea & Bidjerano, 2012). Statistical evidence shows strong relationships between self-efficacy and self-regulation scores in online and traditional learning environments, such that high self-efficacy and positive self-regulation are reliable predictors of academic success in online courses (Bradley et al., 2017).

Zimmerman (2000) defines SRL as "self-generated thoughts, feelings, and actions that are planned and cyclically adapted to personal goals" (p. 14). SRL theorists consider metacognition, behaviour, and motivation (Zimmerman, 1986). SRL strategies are "actions and processes directed at acquiring information or skill that involve agency, purpose, and instrumentality perceptions by learners" (Zimmerman, 1989, p. 329). Examples are goal setting, time management, organisation, self-monitoring, and strategy adjustment. E-learning requires these skills because students study independently and at their own pace. These capacities help learners use e-learning materials and can affect e-learning learners' motivation, engagement, persistence, and success. Self-regulated learning involves planning, monitoring, controlling, and regulating one's learning. Studies by Cho and Shen (2013) have shown that self-regulated learning is essential for academic success in online learning environments: it directly contributes to students achieving their learning goals, especially in contexts with high student autonomy and minimal instructor presence.

Beyond self-regulation, co-regulated learning recognises that learning is often social and involves contact with others (Andrade et al., 2021 ). Assessment today involves co-regulation of learning through interactions with students, teachers, peers, and technology. Co-regulation occurs dynamically in online collaborative learning and may be detected and evaluated to improve learning design (Andrade et al., 2021 ). A study conducted by Liao et al. ( 2023 ) found that learning presence, specifically self-regulated learning (SRL) and co-regulated learning (CoRL), is a significant predictor of student engagement in blended learning. The researchers observed that CoRL is more strongly associated with emotional engagement, while SRL is more closely linked to cognitive engagement.

Students' self-efficacy is confidence in their ability to achieve in certain situations or tasks. Self-efficacy is "people's beliefs about their abilities to produce designated levels of performance that influence events that affect their lives" (Bandura, 1994, p. 71). In other words, individuals cognitively appraise their capability to perform effectively and achieve desired outcomes in particular circumstances or tasks. Students with positive and comparatively high self-efficacy beliefs are more inclined to exhibit engagement in the classroom, as evidenced by their behaviour, cognition, and motivation (Linnenbrink & Pintrich, 2003a, 2003b). Self-efficacy is a significant cognitive factor influencing motivation and engagement (Schunk & Mullen, 2012); Zhang (2022) found that self-efficacy significantly enhances learner engagement and, indeed, emerged as the sole significant variable influencing it in that study. Azila-Gbettor et al. (2021) conducted a study revealing that self-efficacy and autonomous motivation positively impact peer and intellectual engagement. Since self-efficacy, self-regulated learning and co-regulated learning are significant predictors of student engagement, learning presence can be expected to predict student engagement.

Students' engagement

One of the most important goals of e-learning environments in higher education is to get students more engaged in learning. This engagement is fostered through continuous interaction that builds the cognitive and non-cognitive skills needed for academic success (Ituma, 2011). However, online learning formats such as MOOCs have considerable problems inherent in retention and engagement (Lambert & Fisher, 2013). Thus, student engagement is critical for student retention and satisfaction in online courses. Fredricks et al. (2004) argue that within the research community there remains a need for consensus about the definitions, frameworks, and constructs of engagement. Researchers have employed a variety of indicators to predict learning outcomes and evaluate student engagement in various contexts. Often, engagement indicators are analysed from a single standpoint, operationalised through students' participation in particular activities. As an illustration, Wang et al. (2015) quantitatively analysed engagement levels within online discussion forums, and Whitehill et al. (2015) employed measurements of the extent of video lecture viewership.

Furthermore, researchers have made numerous attempts to evaluate student engagement in addition to the abovementioned measures. For example, certain research investigations have employed self-report instruments, such as surveys or questionnaires, to collect data on students' subjective evaluations of their levels of engagement (Joksimović et al., 2018). Other studies have employed observational methodologies to directly assess students' behaviour and engagement levels in a classroom setting. Still other researchers have combined these approaches to acquire a more holistic understanding of student engagement.

Student engagement in online courses refers to the degree to which students actively participate in critical thinking, verbal communication, and interaction with course materials, fellow students, and the instructor. Student participation in the learning process encompasses their engagement and collaboration with the instructor and their peers, indicating their level of involvement (Dixson, 2015a, 2015b). Hodges (2018) defined engagement as a measure of one's level of involvement, enthusiasm, and commitment to an organisation. In academic settings, engagement is described as energy in action, the connection between a person and the activity they perform for a stated goal. Thus, the student's active participation in a task or activity is vital for engagement. Gallup's 2018 study, 'School Engagement Is More Than Just Talk,' discovered that:

Engaged children are 2.5 times more likely to report receiving outstanding grades and performing well. They are 4.5 times more likely than their actively disengaged peers to be optimistic about the future (Hodges, 2018 ).

Students' engagement in the community of inquiry

Students' engagement is key to the success of the CoI model. One of the originators of the CoI framework and a colleague (Garrison & Vaughan, 2008) made the following statements that attest to that fact: the efficacy of the inquiry technique relies on the presence of engagement, and engagement is crucial for a community of inquiry and the overall higher education experience (p. 16); the educational process within the community of inquiry entails both public and private domains, and engagement within a community of inquiry refers to the convergence of these public and private realms (p. 16); and the inquiry approach in education encourages students to engage actively in responsible learning activities (p. 112).

The CoI framework offers a distinct paradigm for identifying student engagement. It extends beyond mere assessment of engagement and explores the quality of interactions, the extent of analytical reasoning, and the whole educational experience for students. The CoI framework assesses the instructor's ability to lead the course effectively (teaching presence), the students' interaction and contribution to the community (social presence), the students' active thinking and comprehension of the material (cognitive presence), and the students' proactive approach to managing their learning and supporting their peers (learning presence). The simultaneous interplay of these four presences signifies a notable degree of student engagement. If any of these elements is lacking, the framework can identify areas where instructors can enhance their teaching tactics or the course design to cultivate a more engaging learning environment.

CoI for online learning can make students' engagement easier (Oyarzun & Morrison, 2013). In online education, a community of inquiry promotes "epistemic engagement" (Shea & Bidjerano, 2010). Epistemic engagement involves actively engaging with knowledge to deepen comprehension, thus making online learning environments more successful. The implication is that students are actively engaged in acquiring knowledge, employing critical thinking skills to analyse the subject matter, and participating in discussions with their peers and instructors to foster a comprehensive understanding of the material. Students' engagement is vital for discovering knowledge, making learners active and instilling lifelong learning capabilities in a digital era characterised by abundant information and learning initiatives, including MOOCs.

Choo et al. ( 2020 ) support this notion, highlighting the framework's usefulness in measuring engagement. Damm ( 2016 ) also confirms the effectiveness of the CoI survey in measuring engagement within MOOCs; nevertheless, the author acknowledges a limitation: the survey cannot definitively pinpoint the cause of low engagement (e.g., is it due to a lack of strong peer interaction?). Despite such limitations, a study by Das and Madhusudan (2023) assessed the CoI model's ability to promote collaborative learning and enhance engagement. The study further indicated that the CoI model significantly contributes to learner engagement, fosters collaborative learning, and improves learner performance across cognitive, emotional, and behavioural domains. Building on these findings, Ginting ( 2021 ) suggests further optimising the presences to enhance student engagement on online platforms.

Research methodology

The research methodology refers to the systematic approach and techniques employed in conducting a study. It encompasses the overall design and data collection. The present study used a quantitative research methodology to examine the RCoI presences that impact student engagement within blended MOOCs serving as an open educational resource (OER).

Participants

Most respondents said they participated in blended MOOCs as first-year students of the University of Cape Coast (UCC). They enrolled on the Alison MOOC platform for Microsoft Office 2010 (Revised 2018) as an OER for a campus-based course entitled ITS 101, which is mandatory for all level 100 (first-year) students. They watched the videos on the platform, participated in the discussion forums and answered the quizzes to obtain a certificate. The marks on their certificate constituted part of their continuous assessment for the campus course in which they were enrolled. Continuing students at the upper levels (200–900) registered on other MOOC platforms such as Saylor.

The research sample comprised 2875 students at the University of Cape Coast (UCC), Ghana, actively engaged in a blended MOOC centred on multiple courses. The total student population as of 2022–2023 was 60,243 (see Table 1), comprising 54,236 undergraduate and 6007 post-graduate students.

Only the regular group is involved with blended MOOCs, making its 26,527 students the target population for the study. Table 2 shows the distribution of the students in the regular group.

Furthermore, the 25,239 students shown in Table 2 constitute the research population. Except for students from levels 50, 250, 850, 900 and 950, most of these students have completed the ITS 101: Information Technology Skills course at UCC. This is a semester-long course based on blended MOOC instructional delivery and is mandatory for all level 100 students at UCC. Some of these 25,239 students have also completed additional blended MOOC courses at levels 200–800. The blended MOOC was an open educational resource (OER), enabling unrestricted access and completion without associated costs. Lecturers selected MOOCs that fit their on-campus courses and instructed students to enrol. Upon completion, the marks students obtained from the MOOCs became part of the continuous assessment for their on-campus courses. Lecturers further organised supervised formative assessments on campus.

Sampling method

Sampling selects a subset of individuals or items from a larger population for research. The researchers employed a two-stage cluster sampling technique to determine the participants for this study. A list of blended MOOC classes offered at UCC was obtained from the staff mailing list, and the population was stratified by academic or year level (100–800). Levels 50, 250, 850, 900 and 950 were excluded as none of their courses uses blended MOOCs as an instructional format. In the first stage, academic levels were randomly selected from the strata using a lottery technique: each level of the population was assigned a level number beginning with 100, and simple random sampling was used to select the levels participating in the research. Levels 100, 200, 300 and 800 were thus picked. In the second stage, a further simple random selection of blended MOOC courses or classes was made within each selected academic level. All students in the selected classes were then included in the study.
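To make the two-stage draw concrete, the following minimal Python sketch reproduces the logic described above. The class lists, the number of levels drawn, and the number of classes per level are hypothetical placeholders for illustration only, not the study's actual sampling frame.

```python
import random

random.seed(1)  # fixed seed only so the illustration is reproducible

# Hypothetical sampling frame: eligible academic levels (clusters); levels
# 50, 250, 850, 900 and 950 are excluded, as in the study.
eligible_levels = [100, 200, 300, 400, 500, 600, 700, 800]

# Hypothetical mapping of each level to its blended MOOC classes.
classes_by_level = {
    level: [f"L{level}-class-{i}" for i in range(1, 6)] for level in eligible_levels
}

# Stage 1: simple random (lottery) selection of academic levels.
selected_levels = random.sample(eligible_levels, k=4)

# Stage 2: simple random selection of blended MOOC classes within each level;
# every student in a selected class enters the sample.
selected_classes = []
for level in selected_levels:
    selected_classes.extend(random.sample(classes_by_level[level], k=2))

print("Selected levels:", sorted(selected_levels))
print("Selected classes:", selected_classes)
```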

Data collection

The questionnaire was designed in Google Forms and sent to all students within the stated categories (see Table 3) through their institutional email addresses, with an estimated completion time of 20 minutes. Tables 3 and 4 show the level and class size of the selected courses in which students completed a blended MOOC at UCC. Students from this population qualified to participate in the research study; thus, the sample was drawn from this accessible population. The benefit of using the Google Forms approach is that all students have institutional mail based on the Google platform. The email invitation contained a link to the survey, together with an introductory letter stating the purpose of the study and guaranteeing the students' anonymity. The letter further explained that the data would be used for academic purposes only and that students could decline to participate if they wished. Students were also informed verbally and encouraged to respond to the questionnaire during classroom sessions. Reminder emails were sent to non-responders every two weeks, beginning two weeks after the initial survey invitation. The survey was open for the whole semester, and all responses were kept confidential.

Data analysis

After the data collection process was completed, the responses to each research question were scrutinised and pruned to ensure that they aligned with the stated instructions. The responses were then numerically coded and uploaded into SPSS version 28. A missing values analysis was performed to ensure that missing data did not render the remaining data invalid. A descriptive analysis was then used to assess the normality of the data distribution; the rationale was to determine whether a parametric or non-parametric analysis approach was appropriate, based on the data's skewness. The study comprised four research questions. To analyse them, the data were converted to a comma-separated values (CSV) file and uploaded into SmartPLS 4 software for partial least squares (PLS) analysis. The approach employed in this study involved two key steps. Firstly, the variables were condensed into a reduced set of components to enhance manageability. Secondly, the analysis was performed on these condensed components instead of conducting a least-squares regression analysis on the original data. The Partial Least Squares (PLS) algorithm employs a methodology akin to principal components analysis to reduce the number of variables; this reduction is achieved by extracting components that capture the strongest correlations among the determinants (Hair et al., 2019). The methodology involves using these components as variables and, with the aid of cross-validation, determining the smaller set of components that exhibits the highest predictive capability (Helland et al., 2018). Data were analysed using SPSS version 28 and SmartPLS version 4. Descriptive statistics were employed to comprehensively describe the sample and the Community of Inquiry (CoI) dimensions. The researchers employed structural equation modelling (SEM) to examine the postulated associations between the CoI dimensions and student engagement.
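As a rough illustration of the "reduce-then-regress" logic described above, the Python sketch below applies scikit-learn's PLSRegression to a hypothetical CSV export of the coded survey; the file name and item prefixes (TP, CP, SP, LP, SE) are assumptions for the example. This is only a conceptual analogue: the study itself used SmartPLS 4, whose PLS-SEM path-modelling estimator differs from plain PLS regression.

```python
import pandas as pd
from sklearn.cross_decomposition import PLSRegression

# Hypothetical CSV export of the coded questionnaire: columns named after the
# construct items, e.g. TP1..TPk, CP1.., SP1.., LP1.. and SE1..SEm.
df = pd.read_csv("coi_survey.csv")

X = df.filter(regex=r"^(TP|CP|SP|LP)").to_numpy(dtype=float)  # presence items
Y = df.filter(regex=r"^SE").to_numpy(dtype=float)             # engagement items

# Extract a small number of components that capture the strongest covariance
# between the presence items and the engagement items, then regress on them.
pls = PLSRegression(n_components=4)
pls.fit(X, Y)

print("R^2 of the component-based prediction:", round(pls.score(X, Y), 3))
```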

Limitations

The present study is subject to several limitations. The study was conducted within a single university (UCC) in Ghana, so the generalisability of its findings to other settings may be limited. Furthermore, the investigation was carried out using a self-report questionnaire, which implies a potential for social desirability bias. Moreover, the study employed a cross-sectional design (i.e., collecting data from multiple subjects at a single point in time), limiting its ability to establish causal relationships between the elements of the CoI and student engagement. Finally, the study did not investigate additional variables that could impact student engagement in blended MOOCs, such as pre-existing knowledge or motivation.

Ethical considerations

Ethical considerations are of paramount importance in academic research, and the ethical implications of this study were carefully examined and addressed. Since this paper is extracted from the PhD dissertation of the lead author, the research was approved by the Institutional Review Board (IRB) of the University of KwaZulu-Natal, South Africa. The participants were provided with information regarding the study and gave their explicit consent to partake in the research. The data were gathered in a manner that ensured anonymity and were stored securely. The participants were duly notified that their involvement in the study was entirely voluntary, and they were assured that they had the freedom to withdraw at any time without seeking permission from the researcher.

Response rate

The average response rate for research surveys varies considerably depending on the type of survey and the audience being surveyed. A meta-analysis of online surveys from various fields showed an average response rate of 39.6% with a standard deviation of 19.6% (Wu et al., 2022). This means that, on average, 39.6% of the people asked to participate in these online surveys did so, while the standard deviation (SD) of 19.6% shows that response rates differed from one study to another, running from as low as 20% to as high as 60% in some cases. Saunders (2014) also noted that a response rate of around 30% for a random sample is considered good. As indicated in Tables 3 and 4, the researcher sent the questionnaire to all 3506 students identified in the accessible population. A link to the same questionnaire was added to the Moodle LMS pages for courses taught using blended MOOCs. This mailing list comprised 3506 students who had used blended MOOCs at UCC for at least one semester. Of the 3138 students who completed the questionnaire, 2875 filled it out successfully without missing data. The response rate was therefore 82%, which is adequate for the study.
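For transparency, the reported rate follows directly from the number of usable responses over the accessible population:

```latex
\frac{2875}{3506} \approx 0.82 = 82\%
```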

Demographic characteristics

Table 4 shows the demographic and biographical data of the respondents. EdTech research depends on such bio-demographic data, which are crucial for designing, implementing and evaluating technology interventions. They can help identify subpopulations that need more nuanced treatments, eliminate confounding factors and inform policy decisions about using technology in the classroom. In a blended MOOC study, including demographic data such as sex may assist researchers in determining whether sex affects learning. This information helps to ensure that instructional technology is sex-neutral and easy to use. Studies suggest sex inequalities in educational technology usage and acceptance; several studies indicate that men are more tech-savvy than women (Irene, 2019).

In educational research, grouping is often used to compare perceptions and use of EdTech between students in STEM programmes and those in non-STEM programmes. The grouping is based on the idea that STEM and non-STEM students have different educational needs and experiences. Lin et al. (2021) reported that the factors affecting students' intentions to continue using the platforms differed between STEM and non-STEM courses, with perceived usefulness being more important in STEM courses and perceived ease of use being more critical in non-STEM courses. Another study, by Alkhalaf and Nguyen (2020), also showed that EdTech positively affects student learning outcomes in both STEM and non-STEM courses, although the effect was larger in STEM courses than in non-STEM courses.

It is essential to consider the academic year of the respondents when analysing the data on satisfaction, academic performance and other variables related to blended MOOCs, as it may impact their experiences and perceptions of the technology. Additionally, understanding how experience with blended MOOCs varies by academic year can inform future design and implementation of such technologies. Based on the data, most respondents (85%) enrolled in blended MOOCs at level 100, indicating they were in their first year of study when they registered. This result suggests that most respondents were relatively new to blended MOOCs and may have had less experience with them than more advanced students. Additionally, the small number of respondents at higher levels (300, 500, and 800) suggests that blended MOOCs may be less commonly used or required at higher levels of study.

Measurement model analyses

Two stages are involved in PLS-SEM analysis. The first is to assess the validity and reliability of the measurement model; once these tests are passed, the subsequent step is the analysis of the structural model, as Hair Jr et al. (2021a) outlined. The data acquired from the study were analysed using the Partial Least Squares Structural Equation Modelling (PLS-SEM) technique in SmartPLS version 4.0.8.1. A path model is a graphical representation that illustrates the hypotheses and relationships between variables in a structural equation modelling (SEM) analysis, as Bollen (2002, as cited in Sarstedt et al., 2021) described. A path model includes structural and measurement models. In PLS-SEM, the outer models are measurement models, while the inner models are structural. The measurement model deals with the individual survey items and the respective latent variables (constructs) they measure. The structural model shows the hypothesised relationships linking the latent variables. Hair et al. (2021a) suggested evaluating the elements discussed in the following sections to assess the study's measurement model.

Internal consistency

Internal consistency is usually judged by how items on the same construct relate to each other. It checks whether there is a link between the scores on different items meant to measure the same construct. It is also called internal reliability or internal consistency reliability.

Cronbach's alpha

Researchers can assess reliability using measures such as Cronbach's alpha, split-half reliability, or test–retest reliability; these measures assess the consistency of responses to items or indicators over time or across different forms or versions of the same test. Cronbach's alpha has long been used to check the reliability, or internal consistency, of each construct within a model. Cronbach's alpha (α) is a straightforward way to determine a score's reliability and is used when more than one item measures the same underlying concept. It measures internal consistency and shows how closely related a group of questions is for a construct or latent variable (Ravinder & Saraswathi, 2020). The alpha value depends on the number of indicator items, how closely related they are, and the dimensionality of the scale. Cronbach's alpha values should fall between 0 and 1, although negative values can occur. According to Tavakol and Dennick (2011), Cronbach's alpha has certain limitations: scales with many items of lower reliability are generally associated with decreased accuracy. The Cronbach's alpha estimates for the constructs are presented in the fourth column of Table 5. The estimates ranged from Student Engagement (0.861) to Learning Presence (0.943), all above 0.722 and thus above the minimum threshold, making the items appropriate for each construct.
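For readers who wish to reproduce this reliability check outside SPSS or SmartPLS, a minimal Python sketch of the standard Cronbach's alpha formula is shown below; the demo matrix contains made-up Likert responses, not the study's data.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for one construct.

    item_scores: a respondents-by-items array of scores on the construct's items.
    """
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Example with hypothetical Likert responses from five respondents on four items.
demo = np.array([[4, 5, 4, 4],
                 [3, 3, 4, 3],
                 [5, 5, 5, 4],
                 [2, 3, 2, 3],
                 [4, 4, 5, 4]])
print(round(cronbach_alpha(demo), 3))
```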

Composite reliability

Jöreskog and Sörbom's (1995) composite reliability (rho_C) is a statistical measure used to assess the internal consistency of scale items. It serves a similar purpose to Cronbach's alpha, as Netemeyer et al. (2003) discussed, and considers the reliability of a set of items loading on a latent construct. Composite reliability thresholds are still a matter of contention, with varying recommendations from researchers (Nunnally & Bernstein, 1994; Shivdas et al., 2020). Reliability also varies with the number of items in a construct: scales with fewer items tend to yield poorer reliability, while more items tend to produce improved reliability. Within the realm of exploratory research, composite reliability ratings within the range of 0.60 to 0.70 are generally deemed acceptable, and ratings falling between 0.70 and 0.90 are considered indicative of an adequate level of reliability. Composite reliability scores above 0.90 (and especially above 0.95) suggest that some indicators are redundant, hurting construct validity (Diamantopoulos et al., 2012). The composite reliability estimates for the constructs are presented in the sixth column of Table 5. The estimates varied between Student Engagement (0.894) and Learning Presence (0.95). The estimated composite reliability of each construct exceeded the recommended minimum threshold of 0.70 (Fornell & Larcker, 1981; Hair et al., 2021b).
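For reference, with standardised outer loadings λ_i for the k indicators of a construct (so that each indicator's error variance is 1 − λ_i²), composite reliability is conventionally computed as:

```latex
\rho_C = \frac{\left(\sum_{i=1}^{k} \lambda_i\right)^{2}}
              {\left(\sum_{i=1}^{k} \lambda_i\right)^{2} + \sum_{i=1}^{k}\left(1 - \lambda_i^{2}\right)}
```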

Reliability coefficient

Cronbach's alpha tends to underestimate the reliability of latent variable scores, while composite reliability tends to overestimate it (Dijkstra & Henseler, 2015). According to Hair et al. (2021b), Cronbach's alpha yields conservative reliability estimates, composite reliability produces more liberal estimates, and the actual reliability of a construct typically falls between these two extremes. Hence, the reliability coefficient (rho_A) generally lies within the range spanned by Cronbach's alpha and composite reliability. The rho_A value can be between 0 and 1, with higher values indicating more reliable item scales; a rho_A value of 0.7 is the lower limit of adequacy (Ahmad & Hussain, 2019; Prasetyo et al., 2022). The reliability coefficients of each construct are shown in the fifth column of Table 5. By inspection, rho_A values range from Student Engagement (0.862) to Learning Presence (0.943). All the constructs met the recommended threshold, indicating that the reliability values were acceptable.

Construct validity

Construct validity is how well each item indicator assesses its intended idea or construct. Assessing construct validity is especially important when researching latent constructs, which cannot be directly measured or observed; thus, measurable indicators are needed. Convergent and discriminant validity are used to examine construct validity, and when both prerequisites are met, a test has construct validity. Convergent validity measures the extent to which indicators relating to the same construct converge; such indicators should correlate strongly. Discriminant validity tests whether theoretically unrelated constructs have unrelated indicators; such indicators should have no or only weak correlations.

Convergent Validity

Convergent validity refers to the extent to which the indicators of a particular concept exhibit consistent alignment in their measurements; assessing it involves examining the variance the items share. Examining the outer loadings of the different items and calculating the average variance extracted (AVE) allowed the convergent validity of the constructs to be checked.

Outer loadings

When the outer loadings of the indicator items that measure a construct are high, the items that make up that construct share a great deal in common (higher communalities). This situation is termed indicator reliability. According to the recommendation by Hair et al. (2017), loadings of 0.708 or above are considered acceptable. When an indicator's loading is above 0.708, the construct accounts for more than 50% of the indicator's variance, signalling that the indicator has sufficient item reliability (Sarstedt et al., 2021). Though numerous studies have suggested that outer loadings of 0.50 indicate low but acceptable reliability (Afthanorhan, 2013; Hair et al., 2019; Hulland, 1999), the researcher eliminated all items that showed outer loadings lower than 0.70 during the initial analysis because they were less reliable, as per Hair et al. (2017). The removed items were coded as CEBE 1, 2, 7, 8, 9, 10, 11, 12, 13, 14, 15; CPTE1, 2; CPR3, LPER4, LPMSR10, LPSEL1, ME1, 5, 6, and 7. The remaining items showed loadings ranging from 0.701 (TPDO1) to 0.798 (CPI3), as presented in the third column of Table 5. These values indicate that the remaining items had adequate indicator reliability and were included in the main study.
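The 0.708 cut-off follows directly from squaring the loading, since the variance an indicator shares with its construct is the squared standardised loading:

```latex
\lambda^{2} = 0.708^{2} \approx 0.50
```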

Average variance extracted

The average variance extracted (AVE) shows the variance of the indicator items explained by their construct. AVE assesses the extent to which the variance observed in a construct's indicators can be attributed to the construct rather than to measurement error. To calculate AVE for a construct, square each indicator's standardised loading and take the average across the construct's indicators. An acceptable value for AVE is 0.50 (Fornell & Larcker, 1981). The seventh column of Table 5 shows that the measurement model's AVE values range from 0.546 to 0.609. They were all above the minimum value, making them acceptable. The outer loadings and AVE values both indicated that the remaining components of the measurement model possessed substantial convergent validity.
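In symbols, with standardised loadings λ_i for the k indicators of a construct:

```latex
\mathrm{AVE} = \frac{1}{k}\sum_{i=1}^{k} \lambda_i^{2}
```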

Discriminant validity

Discriminant validity pertains to the degree to which a latent variable exhibits dissimilarity from other latent variables and thus represents a phenomenon that other latent variables do not represent (Yeboah, 2020). Discriminant validity, sometimes called "divergent validity," checks whether constructs that are not supposed to be related are indeed unrelated, and it establishes how distinct the constructs under study are. The correlation between two concepts that are theoretically unrelated should be weaker than the correlation between two concepts that are related (Nikolopoulou, 2022). In SmartPLS, there are three ways to assess discriminant validity: a) cross-loadings, b) the Fornell and Larcker criterion, and c) the heterotrait-monotrait (HTMT) ratio of correlations.

Cross-loadings

Cross-loading occurs when an item loads substantially on more than one factor, or on a different factor than intended. The cross-loading principle posits that an item's loading on its parent construct should be greater in magnitude than its loading on any other construct in the study. If an item loads more highly on a different construct than on its parent construct, discriminant validity is not established and the measure must be revised to meet the standard. According to Yeboah (2020) and Yeboah and Nyagorme (2020), measurement items demonstrate adequate validity when their loadings are at least 0.708 and are higher on their respective constructs than on other constructs. As shown in Table 6, the items that remained after removing those with lower outer loadings had higher loadings on their respective constructs (bolded) than on other constructs, and all were greater than the minimum recommended value. Thus, the instrument was found to have adequate discriminant validity and was suitable for the study.

Criteria of Fornell and Larcker

Based on this criterion, the correlations between a construct and the other constructs should be smaller than the square root of the average variance extracted (AVE) of that construct. The correlations in question are the Pearson correlation coefficients between the latent constructs. The researcher evaluated the instrument's discriminant validity using the Fornell–Larcker criterion presented in Table 7. According to Yeboah (2020), the square root of the AVE for each construct, shown on the main diagonal, should exceed the corresponding correlations in its column. The data presented in Table 7 suggest that the measurement model successfully meets the Fornell–Larcker criterion.
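Formally, writing r_ij for the correlation between latent constructs i and j, the criterion requires:

```latex
\sqrt{\mathrm{AVE}_i} > \left| r_{ij} \right| \quad \text{for all } j \neq i
```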

Heterotrait-Monotrait (HTMT) ratio of correlation

The Heterotrait-Monotrait Ratio of Correlations (HTMT) is a metric utilised to evaluate discriminant validity within the context of structural equation modelling (SEM) employing the partial least squares (PLS) approach (Henseler et al., 2015). The HTMT is grounded in the multitrait-multimethod matrix framework, which involves examining correlations between indicators of distinct constructs (heterotrait) and correlations between indicators of the same construct (monotrait) (Henseler et al., 2015). The HTMT is computed by dividing the mean of the heterotrait correlations by the geometric mean of the average monotrait correlations, as Henseler et al. (2015) described.

The HTMT method offers certain benefits compared to alternative ways of evaluating discriminant validity, such as the Fornell–Larcker criterion and cross-loading analysis (Henseler et al., 2015). The HTMT method does not assume tau-equivalent measurement models, an assumption that is improbable in most empirical research (Henseler et al., 2015). The HTMT measure proposed by Henseler et al. (2015) also offers a more straightforward and intuitive approach to assessing discriminant validity: it compares the magnitude of relationships between constructs with the magnitude of relationships within constructs. According to Ringle et al. (2022), when the HTMT value falls below 0.90, discriminant validity between two constructs is indicated, whereas an HTMT value exceeding 0.90 suggests a deficiency in discriminant validity. However, it is recommended that researchers employ a threshold of 0.85 when there are substantial conceptual differences between the constructs in the path model (Henseler et al., 2015). Judged against the cut-off value of 0.90 defined by Henseler et al. (2015), the values shown in Table 8 are acceptable. The HTMT values indicate that each construct in the model differed sufficiently from the others and measured distinct characteristics, so the measurement model could discriminate between the constructs. Consequently, the measurement model successfully demonstrated discriminant validity.
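For reference, for constructs i and j with K_i and K_j indicators, the HTMT is the average heterotrait-heteromethod correlation divided by the geometric mean of the average monotrait-heteromethod correlations (Henseler et al., 2015):

```latex
\mathrm{HTMT}_{ij} =
\frac{\dfrac{1}{K_i K_j}\displaystyle\sum_{g=1}^{K_i}\sum_{h=1}^{K_j} r_{ig,jh}}
     {\sqrt{\dfrac{2}{K_i(K_i-1)}\displaystyle\sum_{g<h} r_{ig,ih}\;\cdot\;
            \dfrac{2}{K_j(K_j-1)}\displaystyle\sum_{g<h} r_{jg,jh}}}
```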

Multicollinearity

When two independent predictors are highly correlated, we have a problem known as collinearity, meaning that the two predictors are linearly related. Multicollinearity is a problem in multiple linear regression models when two or more independent variables (predictors) are highly correlated, meaning that one predictor can be accurately predicted from the others by a linear function. Condition indices and variance inflation factors (VIFs) help detect multicollinearity (Lindner et al., 2020). There are several approaches for dealing with multicollinearity, such as a) deleting one or more independent variables from the fit, b) performing a principal components regression, and c) removing variables having strong partial correlations with other variables (Lindner et al., 2020). Feldman (2018) suggested these general rules of thumb: a) there is no multicollinearity among the factors if VIF = 1, b) there is moderate multicollinearity if 1 < VIF < 5, and c) there is high multicollinearity, indicating considerable overlap, if VIF >= 5.
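For reference, the VIF of predictor j is computed from the R_j² obtained when that predictor is regressed on all the other predictors:

```latex
\mathrm{VIF}_j = \frac{1}{1 - R_j^{2}}
```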

Thus, a high VIF value means a construct is strongly linked to other constructs, making it hard to estimate and understand the coefficients. The VIF values of the model are shown in Table  9 .

The highest VIF value in the table is 2.252, which is below the standard threshold of 5 for identifying high multicollinearity. This shows that the model's constructs do not overlap to a problematic degree. The lowest VIF value in the table is 1.000, meaning no multicollinearity exists between E and SP; the implication is that E and SP are two distinct constructs with no shared variance. In general, whereas social presence appears unrelated to the other factors, cognitive presence, learning presence, and teaching presence show a modest level of multicollinearity. Kock (2015) proposes a more conservative threshold of 3.3 that VIF values should not exceed. All VIF values in Table 9 were below Kock's (2015) suggested threshold, indicating that the estimated model was free from multicollinearity.

Discussion of the analysis of the structural model

A structural model analysis is necessary to evaluate the statistical significance of the hypothesised paths in the estimated model. The methodology employed in this study involved a bootstrapping procedure of 5000 resamples conducted in the Partial Least Squares Structural Equation Modelling (PLS-SEM) software. Figure  2 and Table 10 present the results of the bootstrapping procedure.

figure 2

Diagram of the path analysis (path diagram) using SmartPLS

The community of inquiry (CoI) model by Garrison et al. ( 2000 ) and later extension from Shea and Bidjerano ( 2010 ) known as the revised community of inquiry (RCoI) emphasises the importance of teaching, social, cognitive and learning presences to a) promote meaningful learning engagement in online environments and b) understand and design effective online learning environments, including blended MOOCs. Students’ engagement has many dimensions. Since the study deals with blended MOOCs, the item indicators for engagement were adapted from the blended MOOC engagement model (Almutairi & White, 2018 ). In this part, the researchers discuss how four presences  of RCoI are linked to student engagement with the help of the path diagram depicted in Fig.  2 .

Teaching presence and students' engagement

H1 : Teaching presence (TP) will positively impact students' engagement (SE) in the blended MOOC system.

The relationship between teaching presence (TP) and student engagement (SE) in blended MOOCs is complicated. Although TP has been demonstrated to improve SE, the magnitude of this effect depends on context and demographics. This idea is significant because it helps us understand how TP affects SE in blended MOOCs.

The statistics for the relationship between TP and SE are β = 0.109, t = 5.574 > 1.96 at α = 0.05, p = 0.000, CI (0.072, 0.149), f² = 0.015.
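The effect size reported alongside each path is, in the usual PLS-SEM convention, obtained by comparing the variance explained in the endogenous construct with and without the predictor:

```latex
f^{2} = \frac{R^{2}_{\text{included}} - R^{2}_{\text{excluded}}}{1 - R^{2}_{\text{included}}}
```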

Cohen's effect size (f²) is classified as: a) 0.00 ≤ f² < 0.20 as negligible; b) 0.20 ≤ f² < 0.50 as small; c) 0.50 ≤ f² < 0.80 as moderate; and d) f² ≥ 0.80 as large (Cohen, 1988; Hair et al., 2017). Little is known about the effect of teaching presence on student engagement in blended MOOCs. However, the empirical evidence from this study shows a positive relationship between TP and SE, with a path coefficient of β = 0.109 and statistical significance at p = 0.000, supporting the idea that TP is a determinant of SE in blended MOOCs (Littler, 2024). Although statistically significant, the effect size (f² = 0.015) is negligible, indicating that TP makes only a very small unique contribution to explaining SE (Hair et al., 2017). This result suggests investigating other variables that may moderate or mediate the relationship between TP and SE.

This result aligns with the community of inquiry framework, which emphasises the importance of TP, alongside social and cognitive presence, in fostering a rich educational experience (Garrison et al., 2000). The implications of these findings are significant for the design and delivery of blended MOOCs. They suggest that TP, which includes the design, facilitation, and direction of cognitive and social processes to support learning (Anderson et al., 2001), is a key factor in promoting student engagement. This is consistent with research suggesting that TP influences learning persistence in MOOCs (Jung & Lee, 2018). Depending on the blended MOOC setting, teaching presence (TP) can affect student engagement (SE) differently. Course material, teaching tactics, and platform features can affect TP effectiveness, and TP may significantly affect SE in courses requiring more cognitive presence (Cui et al., 2024; Pakula, 2024; Su, 2023). TP may also affect SE differently depending on the MOOC's discipline: STEM and non-STEM courses have different material and patterns of student engagement, so TP may be perceived differently. TP effectiveness can further be affected by instructional design and by student variables such as age, level, culture, and online learning experience. These differences show that TP may not predict SE equally in all learning circumstances, and such factors must be considered when evaluating data and exploring how TP can meet varied learner needs (Agarwal, 2021). The clarity of instructional objectives, the arrangement of content, and the interactive components of the MOOC can increase or decrease TP's influence on SE (Agarwal, 2021; He et al., 2023); well-designed, TP-aligned courses may have a more significant impact on SE (Agarwal, 2021). The findings therefore suggest that various characteristics may mediate or moderate the TP-SE relationship. Future research should examine mediators such as cognitive and social presence and moderators such as student motivation and self-regulation (He et al., 2023; Su et al., 2023). Even though the effect size is minimal, the positive link between TP and SE emphasises its importance in blended MOOCs. To strengthen SE, educators and instructional designers could enhance TP by giving clear guidance, providing timely feedback, and promoting dialogue (He et al., 2023; Pakula, 2024; Su et al., 2023).

Cognitive presence and students' engagement

H2 : The students' cognitive presence (CP) will positively impact their engagement (SE) in the blended MOOC system.

H2 suggests that students' cognitive presence (CP) will boost their engagement (SE) in blended MOOCs. Deep online learning relies on cognitive presence, the extent to which learners can generate meaning through sustained discourse (Garrison & Akyol, 2013). CP is crucial to meaningful learning in blended MOOCs, as Garrison et al. (2003) noted, and this study confirms it.

The statistics for the relationship between CP and SE are β = 0.194, t = 7.140 > 1.96 at α = 0.05, p = 0.000, f² = 0.036, CI (0.141, 0.247).

The analysis shows a significant positive relationship between CP and SE, with a path coefficient of β = 0.194. This significant result (p < 0.001) supports the idea that CP is critical in fostering student engagement in blended MOOCs. The result demonstrates that CP significantly impacts SE, supporting Lee's (2014) finding that higher CP correlates with increased student engagement in online discussions. Although significant, the effect size (f² = 0.036) is negligible, indicating that CP makes only a small unique contribution to SE. This outcome suggests that while CP is undoubtedly important, its overall impact on SE may be influenced by other factors or presences within the blended MOOC context.

In the context of blended MOOC systems, cognitive presence is crucial as it reflects the extent to which learners are able to construct and confirm meaning through sustained reflection and discourse (Garrison et al., 2001). A higher cognitive presence is associated with deeper levels of learning and understanding, which can lead to increased student engagement, a key indicator of successful learning outcomes (Chi, 2023). This assertion shows that deeper knowledge construction promotes engagement in the learning environment. This relationship does not operate in isolation. Blended MOOCs with collaborative tools and discussion forums can boost CP's impact on SE (Akyol & Garrison, 2013). CP affects student engagement (SE) together with teaching, social, and learning presence (Littler, 2024; Maranna et al., 2022). These interactions can result from a more holistic learning environment that fosters SE (Garrison et al., 2010). While the data show that CP directly affects SE, they also suggest that its position in the CoI framework needs to be better understood. In future studies, these intricate interrelationships could inform blended learning instructional design and pedagogy. CP may mediate the relationship between TP and SE, improving student satisfaction and learning (Shea et al., 2003). More research on how CP interacts with SP and LP may yield further insights.

Social presence and students' engagement

H3 : The students' social presence (SP) will positively influence their engagement (SE) in the blended MOOC system

The idea that social presence (SP) dramatically affects students' engagement (SE) in blended MOOCs yields intriguing results (Bergdahl et al., 2020; Ma et al., 2022). Online and blended learning settings depend on social presence, the feeling of belonging and mutual support, to build a sense of community.

The statistics for the relationship between SP and SE are β = 0.082, t = 3.595 > 1.96 at α = 0.05, p = 0.000, CI (0.039, 0.128), f² = 0.007.

This study found a positive relationship between SP and SE, with a path coefficient of β = 0.082, showing that an increase in SP is associated with an increase in SE. The significant relationship (p < 0.001) supports the view that SP positively impacts student engagement in blended MOOCs. Although statistically significant, the effect size (f² = 0.007) indicates low practical influence, with SP making only a very small unique contribution to SE. This negligible effect size (Hair et al., 2017) prompts a closer examination of SP's complex role in student engagement and its connection with the Community of Inquiry (CoI) framework.

The statistical results support the theory that social presence (social interactions and a sense of belonging in online learning environments) increases student engagement (Garrison et al., 2000; Littler, 2024). The positive β coefficient indicates that social presence promotes student engagement, although the effect size (f²) is considered negligible by Cohen's (1988) standards (Hair et al., 2017). This negligible effect size raises questions regarding social presence's multifaceted role in student engagement. Despite its negligible effect size, social presence helps create a compelling learning environment (Littler, 2024). Furthermore, a strong social presence in online courses can make learning more inclusive and engaging, increasing satisfaction and retention (Garrison et al., 2000; Littler, 2024; Shea & Bidjerano, 2009).

SP is crucial to SE, yet it is only one of several factors, as engagement is multidimensional, encompassing behavioural, emotional, and cognitive aspects (Fredricks et al., 2004). SP may mediate the relationship between CP and SE by improving cognitive engagement with course content in a supportive social environment (Shea et al., 2003). Gupta et al. (2024) highlight how social media and digital platforms may inform instructors about student participation and build a sense of belonging in virtual learning environments. Blended MOOCs with collaborative tools and discussion forums can boost SP's impact on SE (Akyol & Garrison, 2013; Gupta et al., 2024). Students' backgrounds and experiences can also affect how they perceive SP and how they participate (Almasi & Zhu, 2023), so individual differences in engagement and the role of digital platforms need to be considered. Educational practitioners and instructional designers could promote online discussions and group activities to boost social presence while also addressing other engagement factors.

Learning presence and students' engagement

H4 : The students' learning presence (LP) will positively impact their engagement (SE) in the blended MOOC system.

Learning presence (LP), a composite of self-efficacy and other cognitive, behavioural and emotional characteristics that help online learners self- and co-regulate, significantly affects their engagement (SE) in blended MOOCs.

The statistics for the relationship between LP and SE are β = 0.350, t = 13.04 > 1.96 for α = 0.05, p < 0.001, f² = 0.136, CI (0.299, 0.403).

The association between LP and SE is substantially supported by the empirical evidence, with a significant path coefficient of β = 0.350 (p < 0.001). The effect size ( f 2  = 0.136) is the largest of the four presences examined, small by Cohen's standards but approaching the threshold for a medium effect, making LP the strongest predictor of SE within the revised Community of Inquiry (RCoI) model in this study. The result indicates that as students' learning presence increases, their engagement in the blended MOOC system also tends to increase (Angelaina & Jimoyiannis, 2012; Popescu & Badea, 2020; Richardson & Swan, 2003; Wicks et al., 2015). Learning presence is a multifaceted construct encompassing self-efficacy and the cognitive, behavioural, and emotional characteristics that facilitate online learners' self- and co-regulation, and these capacities in turn shape engagement in blended MOOC environments. The impact of LP extends beyond SE to the instructional design and pedagogy of blended learning environments (Shea & Bidjerano, 2010). In blended learning, contextual factors such as teaching presence and individual factors such as self-regulated learning (SRL) and co-regulated learning (CoRL) affect student engagement (Liao et al., 2023). Academic self-efficacy (ASE) has also been shown to improve academic success in online learning (Wolverton et al., 2020), and self-efficacy affects students' motivation, learning strategies, and their engagement and achievement in online learning (Bedi, 2023; Saefudin & Yusoff, 2021; She et al., 2021). Taken together, these results show that LP is essential in blended MOOCs: by creating an LP-friendly learning environment and prioritising LP and its components, educators can boost student engagement and design successful, engaging blended learning experiences. Given LP's significant impact on SE, the findings have clear implications for blended learning instructional design and instructor practices.

Previous studies have shown that the CoI framework is essential for increasing engagement in online and blended learning environments, which these results support (Garrison & Cleveland-Innes, 2005 ; Garrison et al., 2010 ; Shea et al., 2010 ; Veletsianos & Kimmons, 2012 ). The CoI framework stresses creating a supportive, interactive learning community where students feel involved and driven to participate (Garrison et al., 2010 ). The results of the hypotheses show that the four presences can make students more engaged and interested in blended MOOCs. Therefore, the instructional design of blended learning courses must consider these elements to create an environment conducive to student engagement and learning.

Coefficient of determination

The coefficient of determination (R², also written R-square) is a statistical metric that quantifies the extent to which one or more independent variables account for the variability in a dependent variable in a regression model. It is bounded between 0 and 1: a value of 0 signifies that the model does not account for any of the variation in the response variable around its mean, whereas a value of 1 indicates that the model accounts for all of that variation. The coefficient of determination is important because it indicates how well a regression model fits the observed data. Values of R² close to 0 indicate that the independent variables explain little of the variation in the dependent variable, while values close to 1 indicate that they explain most of it.
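For a dependent variable y with model predictions ŷ, the coefficient of determination takes its standard form, the share of total variation around the mean that the model reproduces:

$$ R^2 = 1 - \frac{\sum_{i}(y_i - \hat{y}_i)^2}{\sum_{i}(y_i - \bar{y})^2} $$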

Falk and Miller ( 1992 ) proposed a minimum threshold of 0.10 for the coefficient of determination (R²). Cohen (1988) proposed R² values for assessing the strength of endogenous latent variables, categorising them as substantial (0.26), moderate (0.13), and weak (0.02). Chin ( 1998 ) proposed a different categorisation for endogenous latent variables: 0.67 (substantial), 0.33 (moderate), and 0.19 (weak). The R² for the model is 0.568, which indicates a moderate relationship between the four presences of CoI and students' engagement in blended MOOCs (Chin, 1998). In other words, the four presences of CoI together explain 56.8% of the variance in student engagement in blended MOOCs, which marks them as essential factors affecting engagement. The path coefficient of SP (0.082) suggests a weaker relationship than the other three presences, while LP has the highest path coefficient (0.350) and therefore the strongest relationship. This indicates that LP is the most influential of the four presences in promoting engagement in blended MOOCs. The high path coefficient for LP shows that a supportive environment that makes it easy for learners to self-regulate their learning is critical for getting students involved in blended MOOCs. CP and TP, in turn, can be cultivated by supporting learners through well-designed teaching and training.

Q-squared validation

Predictive relevance, denoted by Q² (Q-squared), is another relevant indicator of the validity of the significant paths in the model. Predictive relevance scores of 0.02, 0.15, and 0.35 indicate low, medium, and high relevance, respectively, as stated by Hair et al. ( 2021a , 2021b , 2021c ). A Q² value greater than zero indicates that the model has predictive relevance for the corresponding endogenous construct, which in this study is students' engagement. The present study employs the Q-squared validation technique to ascertain the predictive quality of the proposed model. Table 11 shows a Q² value of 0.347 for the endogenous construct, students' engagement, indicating that the model has medium-to-high predictive relevance for it, while the Q² values for the exogenous presences are 0, as expected for constructs that are not themselves predicted within the model.

Interpreted in this way, the Q² result indicates that the structural model has satisfactory predictive relevance for students' engagement. In practical terms, and consistent with the path coefficients and effect sizes reported above, interventions aimed at improving learning presence may be more effective at increasing student engagement than interventions aimed at enhancing teaching, cognitive, or social presence. However, this is just one way to interpret the results, and further analysis may be needed to draw more definitive conclusions.
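For reference, the Stone-Geisser Q² statistic, conventionally obtained via a blindfolding procedure, compares the squared prediction errors for systematically omitted data points (SSE) with the squared deviations from the mean (SSO), summed over the omission distance D:

$$ Q^2 = 1 - \frac{\sum_{D} \mathrm{SSE}_D}{\sum_{D} \mathrm{SSO}_D} $$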

Importance-performance map analysis

The Importance-Performance Map Analysis (IPMA) is a valuable tool for decision-making because it provides a clear and concise representation of complex data and helps decision-makers identify underperforming and overperforming factors, making it easy to identify trade-offs between competing priorities. The IPMA produced by SmartPLS is a two-dimensional graph: the horizontal axis shows each factor's importance, and the vertical axis shows how well each factor performs.

Importance reflects how strongly each construct contributes to the target construct, that is, how significant each construct is in shaping it, with higher scores (on a 0 to 1 scale here) indicating a greater contribution (Ringle & Sarstedt, 2016).

Performance reflects how well each construct performs, expressed as the average of its latent variable scores rescaled to a range of 0 to 100; lower ratings indicate that the construct is not performing as well as expected (Ringle & Sarstedt, 2016).
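Assuming the usual min-max rescaling described by Ringle and Sarstedt (2016), a construct's performance value is its mean latent variable score rescaled over the endpoints of the underlying measurement scale:

$$ \text{performance} = \frac{\bar{x} - x_{\min}}{x_{\max} - x_{\min}} \times 100 $$

where x̄ is the mean latent variable score and x_min and x_max are the scale endpoints.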

Each factor is shown as a dot on the map; where the dot falls indicates the factor's importance and performance. To use the IPMA for decision-making, we can split the constructs into four quadrants based on their importance and performance. Following Ringle and Sarstedt ( 2016 ), we interpret the four quadrants as indicated below:

The constructs in the upper-right quadrant are very important to respondents and perform highly, meaning they work well and should be maintained.

The constructs in the lower-right quadrant are important but perform poorly, so they need improvement.

The constructs in the upper-left quadrant are not very important but perform highly, which suggests possible overinvestment; resources could be moved elsewhere.

The constructs in the lower-left quadrant are neither important nor high-performing, so the resources allocated to them can be redirected or withdrawn.

When assigning priorities, constructs that fall into the upper-right quadrant are seen as having a high priority, whilst constructs that fall into the lower-left quadrant are regarded as having a low priority.

As described in Table  12 , the IPMA aims to identify direct and indirect exogenous variables within the model that exhibit strong performance or high importance with respect to the endogenous variable, students' engagement (Hair et al., 2021a; Ringle & Sarstedt, 2016). The performance measure reflects how well the direct variables have contributed to engagement with the available resources within a given time frame. The importance index quantifies how important each construct is in forecasting student engagement, while the performance index quantifies the average score of each construct.

The findings indicate that learning presence holds the highest importance, as evidenced by its index value of 0.225. It is closely followed by cognitive presence, which has an importance index of 0.125. Teaching presence ranks third in importance, with an index value of 0.07. Lastly, social presence demonstrates the lowest level of importance, as indicated by its importance index of 0.053. This finding implies that the primary determinant of student engagement is learning presence, with cognitive, teaching, and social presence as secondary factors. Concerning performance, each of the four constructs demonstrates a relatively high performance index. Teaching presence exhibits the highest score (81.728), followed by cognitive presence (81.553), learning presence (81.126), and social presence (79.684). This observation implies that all four constructs exhibit satisfactory performance regarding their influence on student engagement.
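To make the quadrant logic concrete, the short sketch below classifies the four presences using the importance and performance indices reported above; the mean-split cut-offs are an illustrative assumption, since SmartPLS plots the raw map and leaves the choice of reference lines to the analyst.

```python
# Importance and performance indices as reported in Table 12 of this study.
ipma = {
    "Teaching presence":  {"importance": 0.070, "performance": 81.728},
    "Cognitive presence": {"importance": 0.125, "performance": 81.553},
    "Social presence":    {"importance": 0.053, "performance": 79.684},
    "Learning presence":  {"importance": 0.225, "performance": 81.126},
}

# Illustrative reference lines: the mean importance and mean performance.
imp_cut = sum(v["importance"] for v in ipma.values()) / len(ipma)
perf_cut = sum(v["performance"] for v in ipma.values()) / len(ipma)

def quadrant(imp, perf):
    """Map a construct to one of the four IPMA quadrants."""
    if imp >= imp_cut and perf >= perf_cut:
        return "upper-right: keep up the good work"
    if imp >= imp_cut and perf < perf_cut:
        return "lower-right: concentrate improvement here"
    if imp < imp_cut and perf >= perf_cut:
        return "upper-left: possible overinvestment"
    return "lower-left: low priority"

for name, v in ipma.items():
    print(f"{name}: {quadrant(v['importance'], v['performance'])}")
```

Under these illustrative mean-split cut-offs, learning and cognitive presence land in the upper-right quadrant, teaching presence in the upper-left, and social presence in the lower-left.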

Implications relating to the effects of students’ engagement within blended MOOCs

The findings suggest that the four presences of the revised Community of Inquiry (RCoI) framework, namely teaching, cognitive, social, and learning presence, significantly influence student engagement in blended MOOCs. Institutions offering blended MOOCs should therefore focus on creating learning environments, and designing courses, that support and promote all four presences.

The results show that learning presence affects students' engagement the most, followed by cognitive, teaching, and social presence, in that order. Institutions that offer blended MOOCs should therefore prioritise creating a learning environment that encourages peer-to-peer learning, teamwork, idea sharing, and feedback from instructors and fellow students. The results also show that the four presences of the RCoI explain 56.8% of the variance in student engagement in blended MOOCs, as demonstrated by the R² of 0.568, leaving a substantial share of the variance unexplained. This suggests that other factors, such as students' attentiveness, the level of academic challenge, and the intellectual work they undertake, also affect how engaged they are (Ginting, 2021). As student engagement is a complex concept, academic institutions that offer blended MOOCs should promote it broadly.

Although the statistical analysis forms a solid basis for comprehending these patterns, it is crucial to convert these findings into practical methods for educators, instructional designers, and policymakers who seek to optimise blended MOOCs to improve student engagement. Below are some specific ways these groups can apply the results in their blended MOOCs.

For educators and professionals in instructional design

Fostering teaching presence: To cultivate teaching presence, educators should prioritise establishing effective communication channels, providing timely and constructive feedback, and creating a nurturing online learning environment. Training programmes can be developed to improve instructors' digital pedagogical abilities, ensuring they are well prepared to engage students in blended MOOCs effectively.

Enhancing cognitive presence:  To enhance cognitive presence, instructional designers should integrate activities that foster critical thinking and facilitate knowledge creation. These activities may include case studies, problem-solving tasks, and assignments that undergo peer review. Course structures can promote active discourse and introspection, fostering a more profound comprehension and involvement with the subject matter.

Developing social presence : Facilitating student engagement through platforms such as discussion boards, collaborative projects, and peer feedback sessions can foster a strong feeling of community inside blended MOOCs. By integrating synchronous components, such as real-time webinars or virtual office hours, the immediacy of social interactions can be further intensified.

Supporting learning presence:  Embedding strategies to cultivate students' self-regulation and self-efficacy inside the course design is crucial for fostering learning presence. These may encompass activities for setting goals, quizzes for self-assessment, and materials on efficient study techniques. Promoting student autonomy in their learning process is crucial for cultivating a strong learning presence.

For policymakers

Resource allocation:  Policymakers should prioritise allocating resources towards developing blended MOOCs that facilitate interactive and captivating learning experiences. This strategy encompasses allocating resources towards digital infrastructure, providing opportunities for professional growth among educators, and researching the most effective online teaching and learning methods.

Accessibility and inclusion:  Ensuring the accessibility and inclusion of blended MOOCs for a diverse student body is of utmost importance, and policy measures could be established to tackle obstacles to access. Educational institutions should explore cheaper ways of providing internet connectivity, for example zero-rated access to MOOCs and better student internet deals negotiated with internet service providers (ISPs). Nations in the global south could consider adopting schemes similar to United States programmes that provide discounted internet access for educational purposes, such as the Affordable Connectivity Program (ACP) and the Federal Communications Commission's E-Rate programme. Furthermore, consideration should be given to creating culturally and linguistically inclusive, locally relevant content. Lastly, improving the accessibility of MOOCs also means making the courses themselves more accessible by meeting the Web Content Accessibility Guidelines (WCAG), for instance by integrating captions, image descriptions, and multiple modes of learning into the course curriculum to meet the needs of many learners, including those with disabilities.

Quality assurance:  Establishing standards for online course quality, which includes defining criteria for instructional presence, cognitive engagement, and social interaction, can guide the creation of high-quality blended MOOCs. It is necessary to develop continuous evaluation and improvement processes to sustain the effectiveness of online learning environments.

By taking into account these wider educational consequences, the results of our study can guide the creation of blended MOOCs that are both intellectually demanding and highly captivating while being easily accessible to a diverse group of learners. The objective is to utilise the distinctive capabilities of online learning to offer significant and all-encompassing educational experiences that cater to the requirements of the current varied student population.

The study contributes to the discussion on blended MOOCs by exploring students' engagement. By including the notion of learning presence in the revised Community of Inquiry (RCoI) framework, we have highlighted its crucial significance in enhancing the learning experience; the four presences together account for 56.8% of the variation in student engagement, with learning presence the strongest single contributor. Our analysis highlights the importance of learning presence, demonstrating its greater influence on student engagement than the conventional triad of cognitive, social, and teaching presences. This finding supports a shift in perspective towards recognising learning presence as a central aspect of the CoI framework, strengthening its crucial function in promoting a comprehensive and engaging online learning environment.

The empirical results from this study emphasise the necessity for educators and instructional designers to create learning environments that facilitate individual learning, promote meaningful engagement, and offer well-organised pedagogical support (Garrison et al., 2000). Strategies such as collaborative projects, peer reviews, and interactive multimedia are crucial instruments for effectively engaging students. When applied successfully, these strategies can enhance student satisfaction and performance in blended MOOCs, enabling more personalised and powerful learning experiences. We support the position of other scholars that learning presence should be included within the CoI framework (Anderson, 2017; Shea & Bidjerano, 2010). By adopting this integrated perspective, educators, instructional designers, and policymakers are equipped to navigate the complexities of online education, creating experiences that align with learners' varied needs and goals (Fink, 2013). Exploring the RCoI within blended MOOCs is both an academic pursuit and a collective commitment to transforming online education into a space where any student, irrespective of their background, can flourish and reach their full potential (Tang, 2018).

While each presence affects engagement, their interaction implies a complex ecosystem worthy of further study. Future research should examine the synergistic effects of these presences across educational environments to inform online and blended learning experiences, and should investigate each presence's mediating or moderating role with respect to student engagement.

Availability of data and materials

Though the paper is part of an on-going PhD dissertation, the datasets are available from the corresponding author on reasonable request.

Afthanorhan, W. M. A. B. W. (2013). A comparison of partial least square structural equation modeling (PLS-SEM) and covariance based structural equation modeling (CB-SEM) for confirmatory factor analysis. International Journal of Engineering Science and Innovative Technology, 2 (5), 198–205.


Agarwal, R. K. (2021). MOOCS: Challenges & prospects in Indian higher education. In R. Chheda & S. N. Mehta (Eds.), Management practices in digital world. London: Empyreal Publishing House.

Ahmad, S., & Hussain, A. (2019). Authentication of psychosomatic capability and workplace life of teachers scales by structural equation modeling. Journal of Educational Research, 22 (2), 68–81.

Akyol, Z., & Garrison, D. R. (2013). Educational communities of inquiry: Theoretical framework, research and practice (pp. 1–347). https://doi.org/10.4018/978-1-4666-2110-7 .

Alabbasi, D. (2022, April). Factors influencing student engagement in virtual classrooms and their impact on satisfaction. In  Society for information technology & teacher education international conference  (pp. 142–151). Association for the Advancement of Computing in Education (AACE).

Alkhalaf, S., & Nguyen, T. (2020). Exploring the factors influencing the adoption of blended learning at higher education institutions: A study of instructors’ perspectives. Education and Information Technologies, 25 (2), 1157–1178. https://doi.org/10.1007/s10639-019-10022-5


Almutairi, F., & White, S. (2018). How to measure student engagement in the context of blended-MOOC. Interactive Technology and Smart Education, 15 (3), 262–278.

Anderson, T. (2017). How communities of inquiry drive teaching and learning in the digital age. North Contact , 1–16.

Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching presence in a computer conferencing context. Journal of Asynchronous Learning Networks, 5 (2), 1–17.

Andrade, H. L., Brookhart, S. M., & Yu, E. C. (2021, December). Classroom assessment as co-regulated learning: A systematic review. In  Frontiers in education  (Vol. 6, p. 751168). Frontiers.

Angelaina, S., & Jimoyiannis, A. (2012). Analysing students’ engagement and learning presence in an educational blog community. Educational Media International, 49 (3), 183–200.

Arbaugh, J. B. (2007). An empirical verification of the community of inquiry framework. Journal of Asynchronous Learning Networks, 11 (1), 73–85.

Azila-Gbettor, E. M., Mensah, C., Abiemo, M. K., & Bokor, M. (2021). Predicting student engagement from self-efficacy and autonomous motivation: A cross-sectional study. Cogent Education, 8 (1), 1942638.

Bedi, A. (2023). Keep learning: Student engagement in an online environment. Online Learning, 27 (2), 119–136.

Bergdahl, N., Nouri, J., & Fors, U. (2020). Disengagement, engagement and digital skills in technology-enhanced learning. Education and Information Technologies, 25 , 957–983.

Bradley, R. L., Browne, B. L., & Kelley, H. M. (2017). Examining the influence of self-efficacy and self-regulation in online learning. College Student Journal, 51 (4), 518–530.

Bruff, D. O., Fisher, D. H., McEwen, K. E., & Smith, B. E. (2013). Wrapping a MOOC: Student perceptions of an experiment in blended learning. Journal of Online Learning and Teaching, 9 (2), 187–199.

Caskurlu, S., Maeda, Y., Richardson, J. C., & Lv, J. (2020). A meta-analysis addressing the relationship between teaching presence and students’ satisfaction and learning. Computers & Education, 157 , 103966.

Chi, X. (2023). The influence of presence types on learning engagement in a MOOC: The role of autonomous motivation and grit.  Psychology Research and Behavior Management , 5169–5181.

Chin, W. W. (1998). The partial least squares approach to structural equation modeling. Modern Methods for Business Research, 295 (2), 295–336.

Cho, M. H., & Shen, D. (2013). Self-regulation in online learning. Distance Education, 34 (3), 290–301.

Choo, J., Bakir, N., Scagnoli, N. I., Ju, B., & Tong, X. (2020). Using the Community of Inquiry framework to understand students’ learning experience in online undergraduate business courses. TechTrends, 64 , 172–181.

Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Lawrence Erlbaum Associates.

Contact North. (2016, March). Five ways MOOCs are influencing teaching and learning. Ontario's Distance Education and Training Network. contactnorth.ca

Cui, X., Qian, J., Garshasbi, S., Zhang, S., Sun, G., Wang, J., et al. (2024). Enhancing learning effectiveness in livestream teaching: Investigating the impact of teaching, social, and cognitive presences through a community of inquiry lens. STEM Education, 4 (2), 82–105.

Damm, C. A. (2016). Applying a community of inquiry instrument to measure student engagement in large online courses. Current Issues in Emerging eLearning, 3 (1), 9.

Deci, E. L., & Ryan, R. M. (2000). The “what” and “why” of goal pursuits: Human needs and the self-determination of behavior. Psychological Inquiry, 11 (4), 227–268.

Edumadze, J. K. E., Otchere Darko, S., Mensah, S., Bentil, D., & Edumadze, G. E. (2022). SWOT analysis of blended MOOC from Ghanaian university instructors' perspectives. Shanlax International Journal of Arts, Science and Humanities, 10 (1), 67–79. https://doi.org/10.34293/sijash.v10i1.4793.

De Freitas, S. I., Morgan, J., & Gibson, D. (2015). Will MOOCs transform learning and teaching in higher education? Engagement and course retention in online learning provision. British Journal of Educational Technology, 46 (3), 455–471.

Diamantopoulos, A., Sarstedt, M., Fuchs, C., Wilczynski, P., & Kaiser, S. (2012). Guidelines for choosing between multi-item and single-item scales for construct measurement: A predictive validity perspective. Journal of the Academy of Marketing Science, 40 , 434–449.

Dijkstra, T. K., & Henseler, J. (2015). Consistent partial least squares path modeling. MIS Quarterly, 39 (2), 297–316.

Dixson, M. (2015a). Measuring student engagement in the online course: The online student engagement scale (OSE). Online Learning, 19 (4), 143–158.

Dixson, M. D. (2015b). Measuring student engagement in the online course: The Online Student Engagement scale (OSE). Online Learning, 19 (4), n4.

Doo, M. Y., & Bonk, C. J. (2020). The effects of self-efficacy, self-regulation and social presence on learning engagement in a large university class using flipped Learning. Journal of Computer Assisted Learning, 36 (6), 997–1010.

Falk, R. F., & Miller, N. B. (1992). A primer for soft modeling . University of Akron Press.

Feldman, K. (2018, November 7). Variance Inflation Factor (VIF) . Isixsigma. https://www.isixsigma.com/dictionary/variance-inflation-factor-vif/

Fink, L. D. (2013). Creating significant learning experiences: An integrated approach to designing college courses . Wiley.

Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18 (1), 39–50.

Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74 (1), 59–109.

Garrison, D. R. (2015). Thinking collaboratively: Learning in a community of inquiry . Routledge.


Garrison, D. R. (2017). E-Learning in the 21st century: A community of inquiry framework for research and practice (3rd ed.). Routledge/Taylor and Francis.

Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in online learning: Interaction is not enough. The American Journal of Distance Education, 19 (3), 133–148.

Garrison, D. R., & Vaughan, N. D. (2008). Blended learning in higher education: Framework, principles, and guidelines . John Wiley & Sons.

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical Inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2 (2), 87–105.

Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15 (1), 7–23.

Garrison, D. R., Anderson, T., & Archer, W. (2003). A theory of critical inquiry in online distance education. Handbook of Distance Education, 1 (4), 113–127.

Garrison, D. R., Anderson, T., & Archer, W. (2010). The first decade of the community of inquiry framework: A retrospective. The Internet and Higher Education, 13 (1–2), 5–9.

Garrison, D. R. (2022, August 8). Shared metacognition and regulation response . The Community of Inquiry: Editorials

Garrison, D.R., & Akyol, Z. (2013). The Community of Inquiry Theoretical Framework. In Handbook of distance education (pp. 122–138). Routledge.

Ginting, D. (2021). Student engagement and factors affecting active learning in English language teaching. VELES (Voices of English Language Education Society), 5 (2), 215–228.

Gray, J., & Diloreto, M. (2016). The effects of student engagement, student satisfaction, and perceived learning in online learning environments. NCPEA International Journal of Educational Leadership Preparation, 11 (1), 98–119.

Groves, R. (2012, September 21). Georgetown University . Our moment in time: https://blog.provost.georgetown.edu/our-moment-in-time/

Gupta, D., Khan, A. A., Kumar, A., Baghel, M. S., & Tiwar, A. (2024). Socially connected learning harnessing digital platforms for educational engagement. In Navigating innovative technologies and intelligent systems in modern education (pp. 210–228). IGI Global.

Hair, J. F., Jr., Hult, G. T. M., Ringle, C. M., & Sarstedt, M. (2021a). A primer on partial least squares structural equation modeling (PLS-SEM) . Sage publications.

Hair, J. F., Jr., Hult, G. T. M., Ringle, C. M., Sarstedt, M., Danks, N. P., & Ray, S. (2021b). Partial Least Squares Structural Equation Modeling (PLS-SEM) using R, classroom companion: Business . Springer Nature. https://doi.org/10.1007/978-3-030-80519-7_1

Hair, J. F., Jr., Matthews, L. M., Matthews, R. L., & Sarstedt, M. (2017). PLS-SEM or CB-SEM: Updated guidelines on which method to use. International Journal of Multivariate Data Analysis, 1 (2), 107–123.

Hair, J., Jr., Hair, J. F., Jr., Hult, G. T. M., Ringle, C. M., & Sarstedt, M. (2021c). A primer on partial least squares structural equation modelling (PLS-SEM) . Sage publications.

Hair, J. F., Risher, J. J., Sarstedt, M., & Ringle, C. M. (2019). When to use and how to report the results of PLS-SEM. European Business Review, 31 (1), 2–24.

Haw, L. H., Sharif, S. B., & Han, C. G. K. (2022). Predictors of student engagement in science learning: The role of science laboratory learning environment and science learning motivation. ASIA Pacific Journal of Educators and Education . https://doi.org/10.21315/apjee2022.37.2.1

He, J., Liu, Z., & Kong, X. (2023, September). A novel link prediction approach for MOOC forum thread recommendation using personalized pagerank and machine learning. In  2023 3rd international conference on educational technology (ICET)  (pp. 37–41). IEEE.

Helland, I. S., Sæbø, S., Almøy, T., & Rimal, R. (2018). Model and estimators for partial least squares regression. Journal of Chemometrics , 32 (9), e3044.

Henseler, J., Ringle, C. M., & Sarstedt, M. (2015). A new criterion for assessing discriminant validity in variance-based structural equation modeling. Journal of the Academy of Marketing Science, 43 , 115–135.

Hodges, T. (2018, October 25). School engagement is more than just talk . Gallup https://www.gallup.com/education/244022/school-engagement-talk.aspx?version=print

Holotescu, C., Grosseck, G., Crețu, V., & Naaji, A. (2014). Integrating MOOCs in blended courses. In 10th international scientific conference eLearning and software for education. Bucharest.

Hulland, J. (1999). Use of partial least squares (PLS) in strategic management research: A review of four recent studies. Strategic Management Journal, 20 (2), 195–204.

Hussain, M., Zhu, W., Zhang, W., & Abidi, S. M. R. (2018). Student engagement predictions in an e-learning system and their impact on student course assessment scores. In  Computational intelligence and neuroscience ,  2018 .

Irene, B. N. O. (2019). Technopreneurship: a discursive analysis of the impact of technology on the success of women entrepreneurs in South Africa. Digital Entrepreneurship in Sub-Saharan Africa: Challenges, Opportunities and Prospects (pp. 147–173).

Ituma, A. (2011). An evaluation of students’ perceptions and engagement with e-learning components in a campus-based university. Active Learning in Higher Education, 12 (1), 57–68.

Jovanović, J., Gašević, D., Dawson, S., Pardo, A., & Mirriahi, N. (2017). Learning analytics to unveil learning strategies in a flipped classroom. The Internet and Higher Education, 33 , 74–85.

Jimoyiannis, A., & Tsiotakis, P. (2017). Beyond students’ perceptions: Investigating learning presence in an educational blogging community. Journal of Applied Research in Higher Education., 9 (1), 129–146.

Joksimović, S., Poquet, O., Kovanović, V., Dowell, N., Mills, C., Gašević, D., Dawson, S., Graesser, A. C., & Brooks, C. (2018). How do we model learning at scale? A systematic review of research on MOOCs. Review of Educational Research, 88 (1), 43–86.

Jöreskog, K. G., & Sörbom, D. (1995). LISREL 8: Structural equation modeling with the SIMPLIS command language . Scientific Software International.

Jung, Y., & Lee, J. (2018). Learning engagement and persistence in massive open online courses (MOOCS). Computers & Education, 122 , 9–22.

Kang, M. H., Park, J. U., & Shin, S. Y. (2007). Developing a cognitive presence scale for measuring students' involvement during e-learning process. In C. Montgomerie & J. Seale (Eds.), Proceedings of world conference on educational multimedia, hypermedia and telecommunications 2007 (pp. 2823–2828). Association for the Advancement of Computing in Education (AACE).

Kang, M., Park, J. U., & Shin, S. (2007, June). Developing a cognitive presence scale for measuring students' involvement during e-learning process. In  EdMedia+ innovate learning  (pp. 2823–2828). Association for the Advancement of Computing in Education (AACE).

King, R. B. (2015). Sense of relatedness boosts engagement, performance, and well-being: A latent growth model study. Contemporary Educational Psychology, 42 , 26–38.

Kloos, C. D., Muñoz-Merino, P. J., Alario-Hoyos, C., Ayres, I. E., & Fernández-Panadero, C. (2015). Mixing and blending MOOC technologies with face-to-face pedagogies. In  Proceedings of the IEEE global engineering education conference (EDUCON) , Tallin, Estonia (pp. 967–971).

Kock, N. (2015). Common method bias in PLS-SEM: A full collinearity assessment approach. International Journal of e-Collaboration (ijec), 11 (4), 1–10.

Koller, D., Ng, A., Do, C., & Chen, Z. (2013). Retention and intention in massive open online courses: In depth. Educause Review, 48 (3), 62–63.

Koutsakas, P., Karagiannidis, C., Politis, P., & Karasavvidis, I. (2020). A computer programming hybrid MOOC for Greek secondary education. Smart Learning Environments, 7 , 7.

Kozan, K., & Caskurlu, S. (2018). On the Nth presence for the Community of Inquiry framework. Computers & Education, 122 , 104–118.

Kruse, A., & Pongratz, H. (2017). Digital change: How MOOCs transform the educational landscape. In H. Ellermann, P. Kreutter, & W. Messner (Eds.), The Palgrave handbook of managing continuous business transformation (pp. 353–373). Springer.


Lambert, J. L., & Fisher, J. L. (2013). Community of inquiry framework: Establishing community in an online course. Journal of Interactive Online Learning, 12 (1), 1–16.

Lee, S. M. (2014). The relationships between higher order thinking skills, cognitive density, and social presence in online learning. The Internet and Higher Education, 21 , 41–52.

Liao, H., Zhang, Q., Yang, L., & Fei, Y. (2023). Investigating relationships among regulated learning, teaching presence and student engagement in blended learning: An experience sampling analysis. Education and Information Technologies, 28 (10), 12997–13025.

Lin, K. Y., Wu, Y. T., Hsu, Y. T., & Williams, P. J. (2021). Effects of infusing the engineering design process into STEM project-based learning to develop preservice technology teachers’ engineering design thinking. International Journal of STEM Education, 8 (1), 1–15. https://doi.org/10.1186/s40594-020-00258-9

Lindner, T., Puck, J., & Verbeke, A. (2020). Misconceptions about multicollinearity in international business research: Identification, consequences, and remedies. Journal of International Business Studies, 51 (3), 283–298.

Linnenbrink, E. A., & Pintrich, P. R. (2003a). The role of self-efficacy beliefs instudent engagement and learning intheclassroom. Reading & Writing Quarterly, 19 (2), 119–137.

Linnenbrink, E. A., & Pintrich, P. R. (2003b). The role of self-efficacy beliefs in student engagement and learning in the classroom. Reading & Writing Quarterly, 19 (2), 119–137.

Littler, M. (2024). Social, Cognitive, and Teaching Presence as Predictors of Online Student Engagement Among MSN Students [Ph.D. thesis, Walden University]. Walden Dissertations and Doctoral Studies Collection.

Ma, Y., Zuo, M., Yan, Y., Wang, K., & Luo, H. (2022). How do K-12 students’ perceptions of online learning environments affect their online learning engagement? Evidence from China’s COVID-19 school closure period. Sustainability, 14 (23), 15691.

Maphosa, V., & Maphosa, M. (2023). Opportunities and challenges of adopting MOOCs in Africa: A systematic literature review. In S. Goundar (Ed.). Massive open online courses-current practice and future trends . IntechOpen. https://doi.org/10.5772/intechopen.1001518 .

Maranna, S., Willison, J., Joksimovic, S., Parange, N., & Costabile, M. (2022). Factors that influence cognitive presence: A scoping review. Australasian Journal of Educational Technology, 38 (4), 95–111.

McNutt, M. (2013). Bricks and MOOCs. Science, 342 (6157), 402.

Miao, J., & Ma, L. (2022). Students’ online interaction, self-regulation, and learning engagement in higher education: The importance of social presence to online learning. Frontiers in Psychology, 13 , 815220.

Montgomery, A. P., Hayward, D. V., Dunn, W., Carbonaro, M., & Amrhein, C. G. (2015). Blending for student engagement: Lessons learned for MOOCs and beyond. Australasian Journal of Educational Technology, 31 (6), 657.

Moore, R. L., & Miller, C. N. (2022). Fostering cognitive presence in online courses: A systematic review (2008–2020). Online Learning, 26 (1), 130–149.

Netemeyer, R. G., Bearden, W. O., & Sharma, S. (2003). Scaling procedures: Issues and applications . London: SAGE.

Nikolopoulou, K. (2022, September 2). What is discriminant validity? Definition & example. Scribbr. https://www.scribbr.co.uk/research-methods/discriminant-validity-explained/

Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory (3rd ed.). McGraw-Hill.

Onah, D. F., Pang, E. L., & Sinclair, J. E. (2022). An investigation of self-regulated learning in a novel MOOC platform.  Journal of Computing in Higher Education , 1–34.

OpenupEd. (2015). Definition massive open online courses. Heerlen: EADTU. http://www.openuped.eu/images/docs/Definition_Massive_Open_Online_Courses.pdf

Oyarzun, B., & Morrison, G. (2013). Cooperative learning effects on achievement and community of inquiry in online education. The Quarterly Review of Distance Education, 14 (4), 181–194.

Pakula, A. (2024). The role of tutor in massive social language learning: A case study of an academic Italian MOOC. In 18th international technology, education and development conference (pp. 2195–2203). Valencia, Spain. https://doi.org/10.21125/inted.2024.0603 .

Popescu, E., & Badea, G. (2020). Exploring a community of inquiry supported by a social media-based learning environment. Educational Technology & Society, 23 (2), 61–76.

Prasetyo, A., Tamrin, A. G., & Estriyanto, Y. (2022). A successful model of microsoft teams online learning platform in vocational high school. FWU Journal of Social Sciences, 16 (2).

Qaffas, A. A., Kaabi, K., Shadiev, R., & Essalmi, F. (2020). Towards an optimal personalization strategy in MOOCs. Smart Learning Environments, 7 , 14. https://doi.org/10.1186/s40561-020-0117-y

Rahimi, A. R. (2024). A tri-phenomenon perspective to mitigate MOOCs’ high dropout rates: the role of technical, pedagogical, and contextual factors on language learners’ L2 motivational selves, and learning approaches to MOOC. Smart Learning Environments . https://doi.org/10.1186/s40561-024-00297-7

Ranjan, P. (2020). Exploring the Models of Designing Blended & Online Learning Courses for Adoption in Regular Teacher Education Course. In Voices of teachers and teacher educators IX . National Council of Educational Research and Training (NCERT).

Ravinder, E. B., & Saraswathi, A. B. (2020). Literature Review Of Cronbach alpha coefficient (Α) And Mcdonald’s Omega Coefficient (Ω). European Journal of Molecular & Clinical Medicine, 7 (6), 2943–2949.

Richardson, J. C., & Swan, K. (2003). Examining social presence in online courses in relation to students’ perceive learning and satisfaction. Journal of Asynchronous Learning Networks, 7 (1), 68–88.

Ringle, C. M., & Sarstedt, M. (2016). Gain more insight from your PLS-SEM results: The importance-performance map analysis. Industrial Management & Data Systems, 116 (9), 1865–1886.

Ringle, C. M., Wende, S., & Becker, J. M. (2022). SmartPLS 4. Oststeinbek: SmartPLS GmbH.  Journal of Applied Structural Equation Modeling.

Saefudin, W., & Yusoff, S. H. M. (2021). Self-efficacy and student engagement in online learning during pandemic. Global Journal of Educational Research and Management, 1 (4), 219–231.

Sarstedt, M., Ringle, C. M., & Hair, J. F. (2021). Partial least squares structural equation modeling. In Handbook of market research (pp. 587–632). Springer International Publishing.

Saunders, M. (2014). Research methods for business students (6th edn, Greek language edition) . Pearson Education.

Schunk, D. H., & Mullen, C. A. (2012). Self-efficacy as an engaged learner. In  Handbook of research on student engagement  (pp. 219–235). Springer US.

She, L., Ma, L., Jan, A., Sharif Nia, H., & Rahmatpour, P. (2021). Online learning satisfaction during COVID-19 pandemic among Chinese university students: The serial mediation model. Frontiers in Psychology, 12 , 743936.

Shea, P. (2010). Online learning presence. In Proceeding of the European Distance and e-learning network (EDEN) annual conference . Valencia, Spain.

Shea, P., & Bidjerano, T. (2009). Community of inquiry as a theoretical framework to foster “epistemic engagement” and “cognitive presence” in online education. Computers & Education , 52.

Shea, P., & Bidjerano, T. (2010). Learning presence: Towards a theory of self-efficacy, self-regulation, and the development of a community of inquiry in online and blended learning environments. Computers & Education, 55 (4), 1721–1731.

Shea, P., & Bidjerano, T. (2012). Learning presence as a moderator in the community of inquiry model. Computers & Education, 59 (2), 316–326.

Shea, P., Fredericksen, E. E., Pickett, A. M., & Pelz, W. (2003). Student satisfaction and Reported learning in the SUNY Learning Network. In T. Duffy & J. Kirkley (Eds.), Learner-centred theory and practice in distance education. Lawrence Erlbaum.

Shea, P., Hayes, S., & Vickers, J. (2010). Online instructional effort measured through the lens of teaching presence in the community of inquiry framework: A re-examination of measures and approach. International Review of Research in Open and Distributed Learning, 11 (3), 127–154.

Shivdas, A., Menon, D. G., & Nair, C. S. (2020). Antecedents of acceptance and use of a digital library system: Experience from a Tier 3 Indian city. The Electronic Library, 38 (1), 170–185.

Sukor, R., Ayub, A. F. M., Ab, N. K. M. A. R., & Halim, F. A. (2021). Relationship between students’ engagement with academic performance among non-food science students enrolled in food science course. Journal of Turkish Science Education, 18 (4), 638–648.

Tang, H. (2018). Exploring self-regulated learner profiles in MOOCs: A comparative study . The Pennsylvania State University.

Tavakol, M., & Dennick, R. (2011). Making sense of Cronbach’s alpha. International Journal of Medical Education, 2 , 53.

TED. (2014, January 27). Anant Agarwal: Why massively open online courses (still) matter [Video]. YouTube. https://www.youtube.com/watch?v=rYwTA5RA9eU

Ulrich, C., & Nedelcu, A. (2015). Moocs in our university: Hopes and worries. Procedia-Social and Behavioral Sciences, 180 , 1541–1547.

UNICEF Office of Innovation. (2022, May 10). Can tech solve the global education crisis? UNICEF. https://www.unicef.org/innovation/xtc-unicef-edtech-award-finalists

Veletsianos, G., & Kimmons, R. (2012). Networked participatory scholarship: Emergent techno-cultural pressures toward open and digital scholarship in online networks. Computers & Education, 58 (2), 766–774.

Vezne, R., Yildiz Durak, H., & Atman Uslu, N. (2023). Online learning in higher education: Examining the predictors of students’ online engagement. Education and Information Technologies, 28 (2), 1865–1889.

Vieira, D., Mutize, T., & Roser Chinchilla, J. (2020, December 21). Understanding access to higher education in the last two decades . UNESCO. https://www.iesalc.unesco.org/en/2020/12/23/understanding-access-to-higher-education-in-the-last-two-decades/

Walters, H. (2014, January 27). We need to change everything on campus . Ideas. Ted. Com. https://ideas.ted.com/we-need-to-change-everything-on-campus-anant-agarwal-of-edx-on-moocs-mit-and-new-models-of-higher-education/

Wang, M. T., & Degol, J. (2014). Staying engaged: Knowledge and research needs in student engagement. Child Development Perspectives, 8 (3), 137–143.

Wang, X., Yang, D., Wen, M., Koedinger, K., & Rosé, C. P. (2015). Investigating how student's cognitive behavior in MOOC discussion forums affect learning gains . In Proceedings of the 8th international conference on educational data mining (EDM 2015) , June 26–29, 2015 (pp. 226–233). International Educational Data Mining Society (IEDMS). http://www.educationaldatamining.org/EDM2015/uploads/papers/paper_89.pdf

Wertz, R. E. (2022). Learning presence within the Community of Inquiry framework: An alternative measurement survey for a four-factor model. The Internet and Higher Education, 52 , 100832.

Whitehill J., Williams J. J., Lopez G., Coleman C. A., Reich J. (2015). Beyond prediction: First steps toward automatic intervention in MOOC student stopout. In Proceedings of the 8th international conference on educational data mining (EDM’15) , June 26–29, 2015 (pp. 171–178). International Educational Data Mining Society (IEDMS). http://www.educationaldatamining.org/EDM2015/uploads/papers/paper_112.pdf

Wicks, D., Craft, B. B., Lee, D., Lumpe, A., Henrikson, R., Baliram, N., Bian, X., Mehlberg, S., & Wicks, K. (2015). An evaluation of low versus high collaboration in online learning. Online Learning, 19 (4), n4.

Wolverton, C. C., Hollier, B. N. G., & Lanier, P. A. (2020). The impact of computer self efficacy on student engagement and group satisfaction in online business courses. Electronic Journal of e-Learning, 18 (2), 175–188.

Wu, M. J., Zhao, K., & Fils-Aime, F. (2022). Response rates of online surveys in published research: A meta-analysis. Computers in Human Behavior Reports, 7 , 100206

Yeboah, D. (2020). Predicting acceptance of WhatsApp as learning-support tool by higher distance education students in Ghana . Unpublished Ph.D. Thesis. Texila American University, Guyana.

Yeboah, D., & Nyagorme, P. (2020). Validation of non-linear relationships-based UTAUT model on higher distance education students’ acceptance of Whatsapp for supporting learning. Texila International Journal of Academic Research, 7 (2), 27–39. https://doi.org/10.21522/TIJAR.2014.07.02.Art004

Yousef, A. M., Chatti, M., Schroeder, U., & Wosnitza, M. (2015). A usability evaluation of a blended MOOC environment: An experimental case study. International Review of Research in Open and Distributed Learning, 16 (2), 69–93.

Yunusa, A. A., Umar, I. N., & Bervell, B. (2021). Massive open online courses (MOOCs) in Sub-Saharan African Higher Education Landscape: A Bibliometric Review.  MOOC (Massive Open Online Courses) , 1–25.

Yusof, A., Atan, N. A., Harun, J., & Doulatabadi, M. (2017). Understanding learners’ persistence and engagement in Massive Open Online Courses: A critical review for Universiti Teknologi Malaysia. Man in India, 97 (12), 147–157.

Zakaria, M., Awang, S., & Rahman, R. A. (2019). Are MOOCs in blended learning more effective than traditional classrooms for undergraduate learners. Universal Journal of Educational Research, 7 (11), 2417–2424.

Zhang, H., Lin, L., Zhan, Y., & Ren, Y. (2016). The impact of teaching presence on online engagement behaviors. Journal of Educational Computing Research, 54 (7), 887–900.

Zhang, Y. (2013). Benefiting from MOOC. World conference on educational multimedia, Hypermedia and Telecommunications (pp. 1372–1377).

Zhang, Y. (2022). The effect of educational technology on EFL learners’ self-efficacy. Frontiers in Psychology, 13 , 881301.

Zimmerman, B. J. (1986). Becoming a self-regulated learner: Which are the key subprocesses? Contemporary Educational Psychology, 11 (4), 307–313.

Zimmerman, B. J. (1989). A social cognitive view of self-regulated academic learning. Journal of Educational Psychology, 81 (3), 329.

Zimmerman, B. J. (2000). Attaining self-regulation: A social cognitive perspective. In  Handbook of self-regulation  (pp. 13–39). Academic Press.


Acknowledgements

I express my appreciation to (1) Dr. Fadiyah Almutairi for the question items on the Blended MOOCs Engagement Model and (2) Prof Ruth E. H. Wertz for the question items on the community of inquiry. These items were used as part of the survey instrument for the study.

Not available.

Author information

Authors and affiliations

Network and Infrastructure Services, University of Cape Coast, Cape Coast, Ghana

John Kwame Eduafo Edumadze

Department of Mathematics and Computer Science Education, University of KwaZulu-Natal, Pinetown, South Africa

John Kwame Eduafo Edumadze & Desmond Welsey Govender


Contributions

JKEE wrote the PhD thesis from which this paper was extracted. DWG supervised the thesis and approved the final manuscript.

Corresponding author

Correspondence to John Kwame Eduafo Edumadze .

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Edumadze, J.K.E., Govender, D.W. The community of inquiry as a tool for measuring student engagement in blended massive open online courses (MOOCs): a case study of university students in a developing country. Smart Learn. Environ. 11 , 19 (2024). https://doi.org/10.1186/s40561-024-00306-9


Received : 24 August 2023

Accepted : 28 April 2024

Published : 14 May 2024

DOI : https://doi.org/10.1186/s40561-024-00306-9


Keywords

  • Community of inquiry
  • Blended MOOCs
  • MOOCs as open educational resources
  • Structural equation modelling
  • Sub-Saharan Africa


Case Study Teaching Method Improves Student Performance and Perceptions of Learning Gains †

Associated data.

  • Appendix 1: Example assessment questions used to assess the effectiveness of case studies at promoting learning
  • Appendix 2: Student learning gains were assessed using a modified version of the SALG course evaluation tool

Following years of widespread use in business and medical education, the case study teaching method is becoming an increasingly common teaching strategy in science education. However, the current body of research provides limited evidence that the use of published case studies effectively promotes the fulfillment of specific learning objectives integral to many biology courses. This study tested the hypothesis that case studies are more effective than classroom discussions and textbook reading at promoting learning of key biological concepts, development of written and oral communication skills, and comprehension of the relevance of biological concepts to everyday life. This study also tested the hypothesis that case studies produced by the instructor of a course are more effective at promoting learning than those produced by unaffiliated instructors. Additionally, performance on quantitative learning assessments and student perceptions of learning gains were analyzed to determine whether reported perceptions of learning gains accurately reflect academic performance. The results reported here suggest that case studies, regardless of the source, are significantly more effective than other methods of content delivery at increasing performance on examination questions related to chemical bonds, osmosis and diffusion, mitosis and meiosis, and DNA structure and replication. This finding was positively correlated to increased student perceptions of learning gains associated with oral and written communication skills and the ability to recognize connections between biological concepts and other aspects of life. Based on these findings, case studies should be considered as a preferred method for teaching about a variety of concepts in science courses.

INTRODUCTION

The case study teaching method is a highly adaptable style of teaching that involves problem-based learning and promotes the development of analytical skills ( 8 ). By presenting content in the format of a narrative accompanied by questions and activities that promote group discussion and solving of complex problems, case studies facilitate development of the higher levels of Bloom’s taxonomy of cognitive learning; moving beyond recall of knowledge to analysis, evaluation, and application ( 1 , 9 ). Similarly, case studies facilitate interdisciplinary learning and can be used to highlight connections between specific academic topics and real-world societal issues and applications ( 3 , 9 ). This has been reported to increase student motivation to participate in class activities, which promotes learning and increases performance on assessments ( 7 , 16 , 19 , 23 ). For these reasons, case-based teaching has been widely used in business and medical education for many years ( 4 , 11 , 12 , 14 ). Although case studies were considered a novel method of science education just 20 years ago, the case study teaching method has gained popularity in recent years among an array of scientific disciplines such as biology, chemistry, nursing, and psychology ( 5 – 7 , 9 , 11 , 13 , 15 – 17 , 21 , 22 , 24 ).

Although there is now a substantive and growing body of literature describing how to develop and use case studies in science teaching, current research on the effectiveness of case study teaching at meeting specific learning objectives is of limited scope and depth. Studies have shown that working in groups during completion of case studies significantly improves student perceptions of learning and may increase performance on assessment questions, and that the use of clickers can increase student engagement in case study activities, particularly among non-science majors, women, and freshmen ( 7 , 21 , 22 ). Case study teaching has been shown to improve exam performance in an anatomy and physiology course, increasing the mean score across all exams given in a two-semester sequence from 66% to 73% ( 5 ). Use of case studies was also shown to improve students’ ability to synthesize complex analytical questions about the real-world issues associated with a scientific topic ( 6 ). In a high school chemistry course, it was demonstrated that the case study teaching method produces significant increases in self-reported control of learning, task value, and self-efficacy for learning and performance ( 24 ). This effect on student motivation is important because enhanced motivation for learning activities has been shown to promote student engagement and academic performance ( 19 , 24 ). Additionally, faculty from a number of institutions have reported that using case studies promotes critical thinking, learning, and participation among students, especially in terms of the ability to view an issue from multiple perspectives and to grasp the practical application of core course concepts ( 23 ).

Despite what is known about the effectiveness of case studies in science education, questions remain about the functionality of the case study teaching method at promoting specific learning objectives that are important to many undergraduate biology courses. A recent survey of teachers who use case studies found that the topics most often covered in general biology courses included genetics and heredity, cell structure, cells and energy, chemistry of life, and cell cycle and cancer, suggesting that these topics should be of particular interest in studies that examine the effectiveness of the case study teaching method ( 8 ). However, the existing body of literature lacks direct evidence that the case study method is an effective tool for teaching about this collection of important topics in biology courses. Further, the extent to which case study teaching promotes development of science communication skills and the ability to understand the connections between biological concepts and everyday life has not been examined, yet these are core learning objectives shared by a variety of science courses. Although many instructors have produced case studies for use in their own classrooms, the production of novel case studies is time-consuming and requires skills that not all instructors have perfected. It is therefore important to determine whether case studies published by instructors who are unaffiliated with a particular course can be used effectively and obviate the need for each instructor to develop new case studies for their own courses. The results reported herein indicate that teaching with case studies results in significantly higher performance on examination questions about chemical bonds, osmosis and diffusion, mitosis and meiosis, and DNA structure and replication than that achieved by class discussions and textbook reading for topics of similar complexity. Case studies also increased overall student perceptions of learning gains and perceptions of learning gains specifically related to written and oral communication skills and the ability to grasp connections between scientific topics and their real-world applications. The effectiveness of the case study teaching method at increasing academic performance was not correlated to whether the case study used was authored by the instructor of the course or by an unaffiliated instructor. These findings support increased use of published case studies in the teaching of a variety of biological concepts and learning objectives.

Student population

This study was conducted at Kingsborough Community College, which is part of the City University of New York system, located in Brooklyn, New York. Kingsborough Community College has a diverse population of approximately 19,000 undergraduate students. The student population included in this study was enrolled in the first semester of a two-semester sequence of general (introductory) biology for biology majors during the spring, winter, or summer semester of 2014. A total of 63 students completed the course during this time period; 56 students consented to the inclusion of their data in the study. Of the students included in the study, 23 (41%) were male and 33 (59%) were female; 40 (71%) were registered as college freshmen and 16 (29%) were registered as college sophomores. To normalize participant groups, the same student population pooled from three classes taught by the same instructor was used to assess both experimental and control teaching methods.

Course material

The four biological concepts assessed during this study (chemical bonds, osmosis and diffusion, mitosis and meiosis, and DNA structure and replication) were selected as topics for studying the effectiveness of case study teaching because they were the key concepts addressed by this particular course that were most likely to be taught in a number of other courses, including biology courses for both majors and nonmajors at outside institutions. At the start of this study, relevant existing case studies were freely available from the National Center for Case Study Teaching in Science (NCCSTS) to address mitosis and meiosis and DNA structure and replication, but published case studies that appropriately addressed chemical bonds and osmosis and diffusion were not available. Therefore, original case studies that addressed the latter two topics were produced as part of this study, and case studies produced by unaffiliated instructors and published by the NCCSTS were used to address the former two topics. By the conclusion of this study, all four case studies had been peer-reviewed and accepted for publication by the NCCSTS ( http://sciencecases.lib.buffalo.edu/cs/ ). Four of the remaining core topics covered in this course (macromolecules, photosynthesis, genetic inheritance, and translation) were selected as control lessons to provide control assessment data.

To minimize extraneous variation, control topics and assessments were carefully matched in complexity, format, and number with case studies, and an equal amount of class time was allocated for each case study and the corresponding control lesson. Instruction related to control lessons was delivered using minimal slide-based lectures, with emphasis on textbook reading assignments accompanied by worksheets completed by students in and out of the classroom, and small and large group discussion of key points. Completion of activities and discussion related to all case studies and control topics that were analyzed was conducted in the classroom, with the exception of the take-home portion of the osmosis and diffusion case study.

Data collection and analysis

This study was performed in accordance with a protocol approved by the Kingsborough Community College Human Research Protection Program and the Institutional Review Board (IRB) of the City University of New York (CUNY IRB reference 539938-1; KCC IRB application #: KCC 13-12-126-0138). Assessment scores were collected from regularly scheduled course examinations. For each case study, control questions were included on the same examination that were similar in number, format, point value, and difficulty level, but related to a different topic covered in the course that was of similar complexity. Complexity and difficulty of both case study and control questions were evaluated using experiential data from previous iterations of the course; the Bloom’s taxonomy designation and amount of material covered by each question, as well as the average score on similar questions achieved by students in previous iterations of the course, were considered in determining appropriate controls. All assessment questions were scored using a standardized, pre-determined rubric. Student perceptions of learning gains were assessed using a modified version of the Student Assessment of Learning Gains (SALG) course evaluation tool ( http://www.salgsite.org ), distributed in hardcopy and completed anonymously during the last week of the course. Students were presented with a consent form to opt-in to having their data included in the data analysis. After the course had concluded and final course grades had been posted, data from consenting students were pooled in a database and identifying information was removed prior to analysis. Statistical analysis of data was conducted using the Kruskal-Wallis one-way analysis of variance and calculation of the R² coefficient of determination.
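To make the analysis concrete, the following is a minimal sketch, not the authors' code, of how a Kruskal-Wallis comparison for one case-study/control question pair could be run in Python with SciPy; the score arrays are hypothetical placeholders, not data from the study.

```python
# A minimal sketch (not the study's analysis code): Kruskal-Wallis one-way
# analysis of variance comparing examination scores on case-study questions
# with scores on the paired control questions. All values are hypothetical.
import numpy as np
from scipy.stats import kruskal

case_scores = np.array([80, 75, 90, 70, 85, 65, 95, 72])     # hypothetical per-student scores (%)
control_scores = np.array([60, 55, 70, 50, 65, 45, 75, 58])  # hypothetical per-student scores (%)

h_stat, p_value = kruskal(case_scores, control_scores)
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 would be flagged as significant
```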

Teaching with case studies improves performance on learning assessments, independent of case study origin

To evaluate the effectiveness of the case study teaching method at promoting learning, student performance on examination questions related to material covered by case studies was compared with performance on questions that covered material addressed through classroom discussions and textbook reading. The latter questions served as control items; assessment items for each case study were compared with control items that were of similar format, difficulty, and point value ( Appendix 1 ). Each of the four case studies resulted in an increase in examination performance compared with control questions that was statistically significant, with an average difference of 18% ( Fig. 1 ). The mean score on case study-related questions was 73% for the chemical bonds case study, 79% for osmosis and diffusion, 76% for mitosis and meiosis, and 70% for DNA structure and replication ( Fig. 1 ). The mean score for non-case study-related control questions was 60%, 54%, 60%, and 52%, respectively ( Fig. 1 ). In terms of examination performance, no significant difference between case studies produced by the instructor of the course (chemical bonds and osmosis and diffusion) and those produced by unaffiliated instructors (mitosis and meiosis and DNA structure and replication) was indicated by the Kruskal-Wallis one-way analysis of variance. However, the 25% difference between the mean score on questions related to the osmosis and diffusion case study and the mean score on the paired control questions was notably higher than the 13–18% differences observed for the other case studies ( Fig. 1 ).

Figure 1. Case study teaching method increases student performance on examination questions. Mean score on a set of examination questions related to lessons covered by case studies (black bars) and paired control questions of similar format and difficulty about an unrelated topic (white bars). Chemical bonds, n = 54; Osmosis and diffusion, n = 54; Mitosis and meiosis, n = 51; DNA structure and replication, n = 50. Error bars represent the standard error of the mean (SEM). Asterisk indicates p < 0.05.
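As a quick arithmetic check on the figures just reported, the short sketch below (not taken from the paper) recomputes the per-topic point differences and their average from the mean scores quoted in the text.

```python
# Recomputing the case-study vs. control gaps from the mean scores quoted above (in %).
case_means = {"chemical bonds": 73, "osmosis and diffusion": 79,
              "mitosis and meiosis": 76, "DNA structure and replication": 70}
control_means = {"chemical bonds": 60, "osmosis and diffusion": 54,
                 "mitosis and meiosis": 60, "DNA structure and replication": 52}

differences = {topic: case_means[topic] - control_means[topic] for topic in case_means}
print(differences)
# {'chemical bonds': 13, 'osmosis and diffusion': 25, 'mitosis and meiosis': 16,
#  'DNA structure and replication': 18}
print(sum(differences.values()) / len(differences))  # 18.0 -> the ~18% average difference
```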

Case study teaching increases student perception of learning gains related to core course objectives

Student learning gains were assessed using a modified version of the SALG course evaluation tool ( Appendix 2 ). To determine whether completing case studies was more effective at increasing student perceptions of learning gains than completing textbook readings or participating in class discussions, perceptions of student learning gains for each were compared. In response to the question “Overall, how much did each of the following aspects of the class help your learning?” 82% of students responded that case studies helped a “good” or “great” amount, compared with 70% for participating in class discussions and 58% for completing textbook reading; only 4% of students responded that case studies helped a “small amount” or “provided no help,” compared with 2% for class discussions and 22% for textbook reading ( Fig. 2A ). The differences in reported learning gains derived from the use of case studies compared with class discussion and textbook readings were statistically significant, while the difference in learning gains associated with class discussion compared with textbook reading narrowly missed statistical significance ( p = 0.051).

Figure 2. The case study teaching method increases student perceptions of learning gains. Student perceptions of learning gains are indicated by plotting responses to the question “How much did each of the following activities: (A) Help your learning overall? (B) Improve your ability to communicate your knowledge of scientific concepts in writing? (C) Improve your ability to communicate your knowledge of scientific concepts orally? (D) Help you understand the connections between scientific concepts and other aspects of your everyday life?” Responses are represented as follows: Helped a great amount (black bars); Helped a good amount (dark gray bars); Helped a moderate amount (medium gray bars); Helped a small amount (light gray bars); Provided no help (white bars). Asterisk indicates p < 0.05.

To elucidate the effectiveness of case studies at promoting learning gains related to specific course learning objectives compared with class discussions and textbook reading, students were asked how much each of these methods of content delivery specifically helped improve skills that were integral to fulfilling three main course objectives. When students were asked how much each of the methods helped “improve your ability to communicate knowledge of scientific concepts in writing,” 81% of students responded that case studies help a “good” or “great” amount, compared with 63% for class discussions and 59% for textbook reading; only 6% of students responded that case studies helped a “small amount” or “provided no help,” compared with 8% for class discussions and 21% for textbook reading ( Fig. 2B ). When the same question was posed about the ability to communicate orally, 81% of students responded that case studies help a “good” or “great” amount, compared with 68% for class discussions and 50% for textbook reading, while the respective response rates for helped a “small amount” or “provided no help,” were 4%, 6%, and 25% ( Fig. 2C ). The differences in learning gains associated with both written and oral communication were statistically significant when completion of case studies was compared with either participation in class discussion or completion of textbook readings. Compared with textbook reading, class discussions led to a statistically significant increase in oral but not written communication skills.

Students were then asked how much each of the methods helped them “understand the connections between scientific concepts and other aspects of your everyday life.” A total of 79% of respondents declared that case studies help a “good” or “great” amount, compared with 70% for class discussions and 57% for textbook reading ( Fig. 2D ). Only 4% stated that case studies and class discussions helped a “small amount” or “provided no help,” compared with 21% for textbook reading ( Fig. 2D ). Similar to overall learning gains, the use of case studies significantly increased the ability to understand the relevance of science to everyday life compared with class discussion and textbook readings, while the difference in learning gains associated with participation in class discussion compared with textbook reading was not statistically significant ( p = 0.054).

Student perceptions of learning gains resulting from case study teaching are positively correlated to increased performance on examinations, but independent of case study author

To test the hypothesis that case studies produced specifically for this course by the instructor were more effective at promoting learning gains than topically relevant case studies published by authors not associated with this course, perceptions of learning gains were compared for each of the case studies. For both of the case studies produced by the instructor of the course, 87% of students indicated that the case study provided a “good” or “great” amount of help to their learning, and 2% indicated that the case studies provided “little” or “no” help ( Table 1 ). In comparison, an average of 85% of students indicated that the case studies produced by an unaffiliated instructor provided a “good” or “great” amount of help to their learning, and 4% indicated that the case studies provided “little” or “no” help ( Table 1 ). The instructor-produced case studies yielded both the highest and lowest percentage of students reporting the highest level of learning gains (a “great” amount), while case studies produced by unaffiliated instructors yielded intermediate values. Therefore, it can be concluded that the effectiveness of case studies at promoting learning gains is not significantly affected by whether or not the course instructor authored the case study.

Table 1. Case studies positively affect student perceptions of learning gains about various biological topics.

Finally, to determine whether performance on examination questions accurately predicts student perceptions of learning gains, mean scores on examination questions related to case studies were compared with reported perceptions of learning gains for those case studies ( Fig. 3 ). The coefficient of determination (R² value) was 0.81, indicating a strong, but not definitive, positive correlation between perceptions of learning gains and performance on examinations, suggesting that student perception of learning gains is a valid tool for assessing the effectiveness of case studies ( Fig. 3 ). This correlation was independent of case study author.

Figure 3. Perception of learning gains but not author of case study is positively correlated to score on related examination questions. Percentage of students reporting that each specific case study provided “a great amount of help” to their learning was plotted against the point difference between mean score on examination questions related to that case study and mean score on paired control questions. Positive point differences indicate how much higher the mean scores on case study-related questions were than the mean scores on paired control questions. Black squares represent case studies produced by the instructor of the course; white squares represent case studies produced by unaffiliated instructors. R² value indicates the coefficient of determination.
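The R² reported for Fig. 3 can be reproduced in a few lines once the per-case-study values are in hand. The sketch below, which is not the authors' code, uses the point differences derived from Fig. 1 and placeholder perception values, so the printed result will not equal 0.81.

```python
# A minimal sketch of the Fig. 3 calculation: coefficient of determination between
# perceived learning gains and each case study's exam-score advantage.
import numpy as np

point_difference = np.array([13, 25, 16, 18])  # case minus control, from the reported means
great_help_pct = np.array([52, 68, 55, 50])    # % reporting "a great amount" (hypothetical)

r = np.corrcoef(great_help_pct, point_difference)[0, 1]
print(f"R^2 = {r**2:.2f}")
```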

The purpose of this study was to test the hypothesis that teaching with case studies produced by the instructor of a course is more effective at promoting learning gains than using case studies produced by unaffiliated instructors. This study also tested the hypothesis that the case study teaching method is more effective than class discussions and textbook reading at promoting learning gains associated with four of the most commonly taught topics in undergraduate general biology courses: chemical bonds, osmosis and diffusion, mitosis and meiosis, and DNA structure and replication. In addition to assessing content-based learning gains, development of written and oral communication skills and the ability to connect scientific topics with real-world applications was also assessed, because these skills were overarching learning objectives of this course, and classroom activities related to both case studies and control lessons were designed to provide opportunities for students to develop these skills. Finally, data were analyzed to determine whether performance on examination questions is positively correlated to student perceptions of learning gains resulting from case study teaching.

Compared with equivalent control questions about topics of similar complexity taught using class discussions and textbook readings, all four case studies produced statistically significant increases in the mean score on examination questions ( Fig. 1 ). This indicates that case studies are more effective than more commonly used, traditional methods of content delivery at promoting learning of a variety of core concepts covered in general biology courses. The average increase in score on each test item was equivalent to nearly two letter grades, which is substantial enough to elevate the average student performance on test items from the unsatisfactory/failing range to the satisfactory/passing range. The finding that there was no statistical difference between case studies in terms of performance on examination questions suggests that case studies are equally effective at promoting learning of disparate topics in biology. The observations that students did not perform significantly less well on the first case study presented (chemical bonds) compared with the other case studies and that performance on examination questions did not progressively increase with each successive case study suggest that the effectiveness of case studies is not directly related to the amount of experience students have using case studies. Furthermore, anecdotal evidence from previous semesters of this course suggests that, of the four topics addressed by cases in this study, DNA structure and replication and osmosis and diffusion are the first and second most difficult for students to grasp. The lack of a statistical difference between case studies therefore suggests that the effectiveness of a case study at promoting learning gains is not directly proportional to the difficulty of the concept covered. However, the finding that use of the osmosis and diffusion case study resulted in the greatest increase in examination performance compared with control questions and also produced the highest student perceptions of learning gains is noteworthy and could be attributed to the fact that it was the only case study evaluated that included a hands-on experiment. Because the inclusion of a hands-on kinetic activity may synergistically enhance student engagement and learning and result in an even greater increase in learning gains than case studies that lack this type of activity, it is recommended that case studies that incorporate this type of activity be preferentially utilized.

Student perceptions of learning gains are strongly motivating factors for engagement in the classroom and academic performance, so it is important to assess the effect of any teaching method in this context ( 19 , 24 ). A modified version of the SALG course evaluation tool was used to assess student perceptions of learning gains because it has been previously validated as an efficacious tool ( Appendix 2 ) ( 20 ). Using the SALG tool, case study teaching was demonstrated to significantly increase student perceptions of overall learning gains compared with class discussions and textbook reading ( Fig. 2A ). Case studies were shown to be particularly useful for promoting perceived development of written and oral communication skills and for demonstrating connections between scientific topics and real-world issues and applications ( Figs. 2B–2D ). Further, student perceptions of “great” learning gains positively correlated with increased performance on examination questions, indicating that assessment of learning gains using the SALG tool is both valid and useful in this course setting ( Fig. 3 ). These findings also suggest that case study teaching could be used to increase student motivation and engagement in classroom activities and thus promote learning and performance on assessments. The finding that textbook reading yielded the lowest student perceptions of learning gains was not unexpected, since reading facilitates passive learning while the class discussions and case studies were both designed to promote active learning.

Importantly, there was no statistical difference in student performance on examinations attributed to the two case studies produced by the instructor of the course compared with the two case studies produced by unaffiliated instructors. The average difference between the two instructor-produced case studies and the two case studies published by unaffiliated instructors was only 3% in terms of both the average score on examination questions (76% compared with 73%) and the average increase in score compared with paired control items (14% compared with 17%) ( Fig. 1 ). Even when considering the inherent qualitative differences of course grades, these differences are negligible. Similarly, the effectiveness of case studies at promoting learning gains was not significantly affected by the origin of the case study, as evidenced by similar percentages of students reporting “good” and “great” learning gains regardless of whether the case study was produced by the course instructor or an unaffiliated instructor ( Table 1 ).

The observation that case studies published by unaffiliated instructors are just as effective as those produced by the instructor of a course suggests that instructors can reasonably rely on the use of pre-published case studies relevant to their class rather than investing the considerable time and effort required to produce a novel case study. Case studies covering a wide range of topics in the sciences are available from a number of sources, and many of them are free access. The National Center for Case Study Teaching in Science (NCCSTS) database ( http://sciencecases.lib.buffalo.edu/cs/ ) contains over 500 case studies that are freely available to instructors, and are accompanied by teaching notes that provide logistical advice and additional resources for implementing the case study, as well as a set of assessment questions with a password-protected answer key. Case study repositories are also maintained by BioQUEST Curriculum Consortium ( http://www.bioquest.org/icbl/cases.php ) and the Science Case Network ( http://sciencecasenet.org ); both are available for use by instructors from outside institutions.

It should be noted that all case studies used in this study were rigorously peer-reviewed and accepted for publication by the NCCSTS prior to the completion of this study ( 2 , 10 , 18 , 25 ); the conclusions of this study may not apply to case studies that were not developed in accordance with similar standards. Because case study teaching involves skills such as creative writing and management of dynamic group discussion in a way that is not commonly integrated into many other teaching methods, it is recommended that novice case study teachers seek training or guidance before writing their first case study or implementing the method. The lack of a difference observed in the use of case studies from different sources should be interpreted with some degree of caution since only two sources were represented in this study, and each by only two cases. Furthermore, in an educational setting, quantitative differences in test scores might produce meaningful qualitative differences in course grades even in the absence of a p value that is statistically significant. For example, there is a meaningful qualitative difference between test scores that result in an average grade of C− and test scores that result in an average grade of C+, even if there is no statistically significant difference between the two sets of scores.

In the future, it could be informative to confirm these findings using a larger cohort, by repeating the study at different institutions with different instructors, by evaluating different case studies, and by directly comparing the effectiveness of the case study teaching method with additional forms of instruction, such as traditional chalkboard and slide-based lecturing, and laboratory-based activities. It may also be informative to examine whether demographic factors such as student age and gender modulate the effectiveness of the case study teaching method, and whether case studies work equally well for non-science majors taking a science course compared with those majoring in the subject. Since the topical material used in this study is often included in other classes in both high school and undergraduate education, such as cell biology, genetics, and chemistry, the conclusions of this study are directly applicable to a broad range of courses. Presently, it is recommended that the use of case studies in teaching undergraduate general biology and other science courses be expanded, especially for the teaching of capacious issues with real-world applications and in classes where the development of written and oral communication skills is a key objective. The use of case studies that involve hands-on activities should be emphasized to maximize the benefit of this teaching method. Importantly, instructors can be confident in the use of pre-published case studies to promote learning, as there is no indication that the effectiveness of the case study teaching method is reliant on the production of novel, customized case studies for each course.

SUPPLEMENTAL MATERIALS

Acknowledgments.

This article benefitted from a President’s Faculty Innovation Grant, Kingsborough Community College. The author declares that there are no conflicts of interest.

† Supplemental materials available at http://jmbe.asm.org


Learning Matters

A student led session on reviving curiosity and student engagement


First year student reps, Ismah Irsalina Binti Irwandy and Liv Camacho Wejbrandt, co-developed and delivered a workshop for lecturers at the Engineering and Informatics Teaching and Learning Away Day. Here they explain the part they played in developing the session, insights from their survey of students and staff on engaging teaching, and what they learned from delivering the session.  

Ismah and Liv are happy to share the resources used in the session and to advise staff and students from other schools on developing their own activities. 

What we did

In November 2023 we responded to a call from our school’s Director for Teaching and Learning (DTL), Dr Luis Ponce Cuspinero, asking for student representatives to develop and deliver a session on reviving curiosity and student engagement for the Engineering and Informatics School teaching and learning away day in January 2024.  

How we did it

Luis started by sharing the aims of the 50-minute session, which were to help lecturers understand the kinds of approaches to teaching and learning Engineering and Informatics students found most engaging and to encourage them to think about how they might better encourage their students’ curiosity and provide even more engaging teaching sessions. Ismah, who was the first to sign up, primarily worked with Luis on developing the questions and activities for the session. Liv joined a little later and led more on developing the presentation and delivery of the session. Planning meetings with Luis ran for between 15 to 30 minutes each week, over around 5 weeks. 

The first step was to develop a survey for Engineering and Informatics students to find out the kinds of teaching they find most engaging. Ismah brainstormed a long list of questions and, with Luis’ help, whittled them down to five, which were then put onto Poll Everywhere and sent to all students via the School Office.  (The questions are provided below). 

Our approach to designing the session was to make it interactive and engaging and to demonstrate how we like to be taught! The final session comprised three sections: 

(1) The ice breaker ‘reflective activity’: 

We developed a ‘pass the parcel’ style game, which was designed to get everyone energized and in the mood. Each table was given a bowl of folded paper slips, each printed with a prompt. Some were really simple, like ‘Describe a teacher that inspired you’, or, ‘Share an ‘aha’ moment you’ve had while teaching’. Others were a bit more challenging, e.g.: ‘How would you re-design your module to make it more engaging?’ or, ‘Describe one of your modules as if to a 12-year-old’.

On the day, we played music as the bowl was passed around the table and whoever was left holding it had a minute to pick out a slip and share their answer. At the end of the section, we asked someone from each table to volunteer to share their response to one of the questions with the room.

(2) The ‘How well do you know your students’ activity: 

We used Poll Everywhere to put each of the five student survey questions to the room. After each question we reviewed the lecturer responses, then shared the results from the student survey and briefly picked out where there were similarities and differences.

(3) The ‘Embedding curiosity and engaging students’ activity 

We wanted to ensure lecturers were given a chance to apply insights from the first two activities, so we then asked each table (team) to work together to embed curiosity and student engagement into a module. One volunteer (the leader) from each table was to describe in brief (3 minutes) one of their modules, the teaching methods, type of assessment, and how feedback is provided. The team then had to come up with ideas and suggestions for how the module could be changed to make it more engaging and inspire curiosity in relation to:

  • Teaching delivery (teaching methods) 
  • Assessment types 
  • Providing feedback 

We gave them 10 minutes to discuss then opened up the floor for team leaders to summarise their proposed changes.  

How it went

We got close to 70 responses to the student survey by the time of the away day (and have had more since!). We think it helped that we wrote the email and insisted the poll was at the top of the message (‘please do this poll – it will take 2 minutes’), followed by the explanation.

On the day the session went well. We played to our strengths (Liv is used to being on stage so took the lead) but it was really good to be doing it together. We were concerned about balancing being fun and respectful while also challenging our lecturers. Happily, the audience were positive and the active approach to the session made it easier for us overall. However, it also meant we had to deal with unexpected outcomes and be confident in encouraging responses from the tables. Also, while there were a few surprises in the outcomes of the student survey (including for us), it was great to see that there was also a lot of overlap and common ground.

Liv concluded by impressing on lecturers the importance of showing their own love for their subjects and, for both of us, it was a rare opportunity to say something we feel deeply about directly to lecturers.

After the session we received lots of positive comments and had some great conversations, including with one lecturer who spoke with us for a long time asking about making his lectures more engaging. Also, we got a free lunch!  

Top Tips 

Our tips for other students are: 

  • It is definitely worth doing. Although it was a commitment at a busy time (we were studying for exams while developing the session), we felt the session had an impact and it made us feel like proper student representatives, particularly as, being first years, we hadn’t had many rep meetings by that point. 
  • You don’t have to start from scratch! We’re really happy for others to use and build on our approach and to chat with students and lecturers from other schools (see details of the activities from the session below and how to contact us).  

Comments and feedback 

“ I was incredibly impressed by Ismah and Liv’s contribution to the content and delivery of this session and have been busy encouraging Directors for Teaching and Learning from the other Sciences Schools I support to follow suit with their own students. My only regret is that Ismah and Liv’s session didn’t kick off the Teaching and Learning away day because it was a brilliant example of an engaging and active learning session which brought real energy to the day while providing that all-important student perspective .” (Dr Sam Hemsley, Academic Developer) 

Pass the parcel questions

Student survey questions

Please direct all queries to Luis Ponce Cuspinera .


Design research lab studying physical robot interaction

A case study of student-community interaction through an education-first assistive device design class

Stuart, H. S., Torres, W. O., & McPherson, A. I.

Accepted for publication; more details coming soon.

  • Open access
  • Published: 11 May 2024

Does a perceptual gap lead to actions against digital misinformation? A third-person effect study among medical students

  • Zongya Li   ORCID: orcid.org/0000-0002-4479-5971 1 &
  • Jun Yan   ORCID: orcid.org/0000-0002-9539-8466 1  

BMC Public Health volume  24 , Article number:  1291 ( 2024 ) Cite this article

312 Accesses

12 Altmetric

Metrics details

We are making progress in the fight against health-related misinformation, but mass participation and active engagement are far from adequate. Focusing on pre-professional medical students with above-average medical knowledge, our study examined whether and how the third-person perception (TPP), the tendency to perceive media messages as having a greater effect on others than on oneself, would motivate their actions against misinformation.

We collected the cross-sectional data through a self-administered paper-and-pencil survey of 1,500 medical students in China during April 2022.

Structural equation modeling (SEM) analysis showed that TPP was negatively associated with medical students’ actions against digital misinformation, including rebuttal of misinformation and promotion of corrective information. However, self-efficacy and collectivism served as positive predictors of both actions. Additionally, we found that professional identification failed to play a significant role in influencing TPP, while digital misinformation self-efficacy was found to broaden the third-person perceptual gap and collectivism tended to reduce the perceptual bias significantly.

Conclusions

Our study contributes both to theory and practice. It extends the third-person effect theory by moving beyond the examination of restrictive actions and toward the exploration of corrective and promotional actions in the context of misinformation. It also lends a new perspective to the current efforts to counter digital misinformation: involving pre-professionals (in this case, medical students) in the fight.


Introduction

The widespread persistence of misinformation in the social media environment calls for effective strategies to mitigate the threat to our society [ 1 ]. Misinformation has received substantial scholarly attention in recent years [ 2 ], and although solution-oriented explorations have long been a focus, the subject remains underexplored [ 3 ].

Health professionals, particularly physicians and nurses, are highly expected to play a role in the fight against misinformation as they serve as the most trusted information sources regarding medical topics [ 4 ]. However, some barriers, such as limitations regarding time and digital skills, greatly hinder their efforts to tackle misinformation on social media [ 5 ].

Medical students (i.e., college students majoring in health/medical science), in contrast to medical faculty, have a greater potential to become the major force in dealing with digital misinformation, as they are not only equipped with basic medical knowledge but also generally possess stronger social media skills than medical faculty [ 6 ]. Few studies, to our knowledge, have tried to explore the potential of these pre-professionals in tackling misinformation. Our research thus fills the gap by specifically exploring how these pre-professionals can be motivated to fight against digital health-related misinformation.

The third-person perception (TPP), which states that people tend to perceive media messages as having a greater effect on others than on themselves [ 7 ], has been found to play an important role in influencing individuals’ coping strategies related to misinformation. But empirical exploration from this line of studies has yielded contradictory results. Some studies revealed that individuals who perceived a greater negative influence of misinformation on others than on themselves were more likely to take corrective actions to debunk misinformation [ 8 ]. In contrast, some research found that stronger TPP reduced individuals’ willingness to engage in misinformation correction [ 9 , 10 ]. Such conflicting findings impel us to examine the association between the third-person perception and medical students’ corrective actions in response to misinformation, thus attempting to unveil the underlying mechanisms that promote or inhibit these pre-professionals’ engagement with misinformation.

Researchers have also identified several perceptual factors that motivate individuals’ actions against misinformation, especially efficacy-related concepts (e.g., self-efficacy and health literacy) and normative variables (e.g., subjective norms and perceived responsibility) [ 3 , 8 , 9 ]. However, most studies devote attention to the general population; little is known about whether and how these factors affect medical students’ intentions to deal with misinformation. We recruited Chinese medical students in order to study a social group that is mutually influenced by cultural norms (collectivism in Chinese society) and professional norms. Meanwhile, systematic education and training equip medical students with abundant clinical knowledge and good levels of eHealth literacy [ 5 ], which enable them to have potential efficacy in tackling misinformation. Our study thus aims to examine how medical students’ self-efficacy, cultural norms (i.e., collectivism) and professional norms (i.e., professional identification) impact their actions against misinformation.

Previous research has found self-efficacy to be a reliable moderator of optimistic bias, the tendency for individuals to consider themselves as less likely to experience negative events but more likely to experience positive events as compared to others [ 11 , 12 , 13 ]. As TPP is thought to be a product of optimistic bias, accordingly, self-efficacy should have the potential to influence the magnitude of third-person perception [ 14 , 15 ]. Meanwhile, scholars also suggest that the magnitude of TPP is influenced by social distance corollary [ 16 , 17 ]. Simply put, individuals tend to perceive those who are more socially distant from them to be more susceptible to the influence of undesirable media than those who are socially proximal [ 18 , 19 , 20 ]. From a social identity perspective, collectivism and professional identification might moderate the relative distance between oneself and others while the directions of such effects differ [ 21 , 22 ]. For example, collectivists tend to perceive a smaller social distance between self and others as “they are less likely to view themselves as distinct or unique from others” [ 23 ]. In contrast, individuals who are highly identified with their professional community (i.e., medical community) are more likely to perceive a larger social distance between in-group members (including themselves) and out-group members [ 24 ]. In this way, collectivism and professional identification might exert different effects on TPP. On this basis, this study aims to examine whether and how medical students’ perceptions of professional identity, self-efficacy and collectivism influence the magnitude of TPP and in turn influence their actions against misinformation.

Our study builds a model that reflects the theoretical linkages among self-efficacy, collectivism, professional identity, TPP, and actions against misinformation. The model, which clarifies the key antecedents of TPP and examines the mediating role of TPP, contributes to the third-person effect literature and offers practical insights for countering digital misinformation.
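To illustrate what such a model looks like in practice, the sketch below specifies the hypothesised measurement and structural paths in lavaan-style syntax using the Python package semopy. This is an illustration under assumed survey items (tpp1, se1, and so on are hypothetical names), not the authors' analysis code.

```python
# A hedged sketch, not the authors' analysis: antecedents of TPP (professional
# identification, self-efficacy, collectivism) and TPP's downstream effects on
# corrective and promotional actions, specified for semopy.
import pandas as pd
from semopy import Model

MODEL_DESC = """
TPP =~ tpp1 + tpp2 + tpp3
SelfEfficacy =~ se1 + se2 + se3
Collectivism =~ col1 + col2 + col3
ProfIdent =~ pi1 + pi2 + pi3
Corrective =~ cor1 + cor2 + cor3
Promotional =~ pro1 + pro2 + pro3
TPP ~ ProfIdent + SelfEfficacy + Collectivism
Corrective ~ TPP + SelfEfficacy + Collectivism
Promotional ~ TPP + SelfEfficacy + Collectivism
"""

data = pd.read_csv("survey_items.csv")  # hypothetical item-level survey responses
model = Model(MODEL_DESC)               # lines with "=~" define latent constructs; "~" defines regressions
model.fit(data)
print(model.inspect())                  # path estimates, standard errors and p-values
```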

Context of the study

As pre-professionals equipped with specialized knowledge and skills, medical students have been involved in efforts in health communication and promotion during the pandemic. For instance, thousands of medical students have participated in various volunteering activities in the fight against COVID-19, such as case data visualization [ 25 ], psychological counseling [ 26 ], and providing online consultations [ 27 ]. Due to the shortage of medical personnel and the burden of work, some medical schools also encouraged their students to participate in health care assistance in hospitals during the pandemic [ 28 , 29 ].

The flood of COVID-19-related misinformation has posed an additional threat to and burden on public health. We have an opportunity to address this issue and respond to the general public’s call for guidance from the medical community about COVID-19 by engaging medical students as a main force in the fight against coronavirus-related misinformation.

Literature review

The third-person effect in the misinformation context

Originally proposed by Davison [ 7 ], the third-person effect (TPE) hypothesizes that people tend to perceive a greater effect of mass media on others than on themselves. Specifically, the TPE consists of two key components: the perceptual and the behavioral [ 16 ]. The perceptual component centers on the perceptual gap whereby individuals tend to perceive that others are more influenced by media messages than themselves. The behavioral component refers to the behavioral outcomes of the self-other perceptual gap, in which people act in accordance with such perceptual asymmetry.
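In practice, the perceptual component is commonly operationalised as a simple difference score; the sketch below illustrates that convention (the rating scale and function name are illustrative, not taken from this study).

```python
# Illustrative only: a common operationalisation of the third-person perception (TPP)
# is the perceived effect of a message on others minus the perceived effect on oneself,
# each rated on the same Likert-type scale.
def third_person_perception(effect_on_others: int, effect_on_self: int) -> int:
    """Positive values indicate a third-person perceptual gap."""
    return effect_on_others - effect_on_self

# A respondent rating misinformation's influence 6/7 on others but 3/7 on themselves:
print(third_person_perception(6, 3))  # 3 -> a sizeable perceptual gap
```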

According to Perloff [ 30 ], the TPE is contingent upon situations. For instance, one general finding suggests that when media messages are considered socially undesirable, nonbeneficial, or risky, the TPE is amplified [ 16 ]. Misinformation, characterized as inaccurate, misleading, and even false, is regarded as undesirable in nature [ 31 ]. Based on this line of reasoning, we anticipate that people will tend to perceive that others would be more influenced by misinformation than themselves.

Recent studies also provide empirical evidence of the TPE in the context of misinformation [ 32 ]. For instance, an online survey of 511 Chinese respondents conducted by Liu and Huang [ 33 ] revealed that individuals would perceive others to be more vulnerable to the negative influence of COVID-19 digital disinformation. An examination of the TPE within a pre-professional group – the medical students–will allow our study to examine the TPE scholarship in a particular population in the context of tackling misinformation.

Why TPE occurs among medical students: a social identity perspective

Of the works that have provided explanations for the TPE, the well-known ones include self-enhancement [ 34 ], attributional bias [ 35 ], self-categorization theory [ 36 ], and the exposure hypothesis [ 19 ]. In this study, we argue for a social identity perspective as being an important explanation for third-person effects of misinformation among medical students [ 36 , 37 ].

The social identity explanation suggests that people define themselves in terms of their group memberships and seek to maintain a positive self-image through favoring the members of their own groups over members of an outgroup, which is also known as downward comparison [ 38 , 39 ]. In intergroup settings, the tendency to evaluate their ingroups more positively than the outgroups will lead to an ingroup bias [ 40 ]. Such an ingroup bias is typically described as a trigger for the third-person effect as individuals consider themselves and their group members superior and less vulnerable to undesirable media messages than are others and outgroup members [ 20 ].

In the context of our study, medical students highly identified with the medical community tend to maintain a positive social identity through an intergroup comparison that favors the ingroup and derogates the outgroup (i.e., the general public). It is likely that medical students consider themselves belonging to the medical community and thus are more knowledgeable and smarter than the general public in health-related topics, leading them to perceive the general public as more vulnerable to health-related misinformation than themselves. Accordingly, we propose the following hypothesis:

H1: As medical students’ identification with the medical community increases, the TPP concerning digital misinformation will become larger.

What influences the magnitude of TPP

Previous studies have demonstrated that the magnitude of the third-person perception is influenced by a host of factors including efficacy beliefs [ 3 ] and cultural differences in self-construal [ 22 , 23 ]. Self-construal is defined as “a constellation of thoughts, feelings, and actions concerning the relationship of the self to others, and the self as distinct from others” [ 41 ]. Markus and Kitayama (1991) identified two dimensions of self-construal: independent and interdependent. Generally, collectivists hold an interdependent view of the self that emphasizes harmony and relatedness and places importance on belonging, whereas individualists tend to have an independent view of the self and thus view themselves as distinct and unique from others [ 42 ]. Accordingly, cultural values such as collectivism-individualism should also play a role in shaping third-person perception due to the adjustment that people make of the self-other social identity distance [ 22 ].

Set in a Chinese context and aiming to explore the potential of individual-level approaches to deal with misinformation, this study examines whether collectivism (the prevailing cultural value in China) and self-efficacy (an important determinant of one’s behavioral intentions) would affect the magnitude of TPP concerning misinformation and how such impact in turn would influence their actions against misinformation.

The impact of self-efficacy on TPP

Bandura [ 43 ] refers to self-efficacy as one’s perceived capability to perform a desired action required to overcome barriers or manage challenging situations. He also suggests understanding self-efficacy as “a differentiated set of self-beliefs linked to distinct realms of functioning” [ 44 ]. That is to say, self-efficacy should be specifically conceptualized and operationalized in accordance with specific contexts, activities, and tasks [ 45 ]. In the context of digital misinformation, this study defines self-efficacy as one’s belief in his/her abilities to identify and verify misinformation within an affordance-bounded social media environment [ 3 ].

Previous studies have found self-efficacy to be a reliable moderator of biased optimism, which indicates that the more efficacious individuals consider themselves, the greater biased optimism will be invoked [ 12 , 23 , 46 ]. Even if self-efficacy deals only with one’s assessment of self in performing a task, it can still create the other-self perceptual gap; individuals who perceive a higher self-efficacy tend to believe that they are more capable of controlling a stressful or challenging situation [ 12 , 14 ]. As such, they are likely to consider themselves less vulnerable to negative events than are others [ 23 ]. That is, individuals with higher levels of self-efficacy tend to underestimate the impact of harmful messages on themselves, thereby widening the other-self perceptual gap.

In the context of fake news, which is closely related to misinformation, scholars have confirmed that fake news efficacy (i.e., a belief in one’s capability to evaluate fake news [ 3 ]) may lead to a larger third-person perception. Based upon previous research evidence, we thus propose the following hypothesis:

H2: As medical students’ digital misinformation self-efficacy increases, the TPP concerning digital misinformation will become larger.

The influence of collectivism on TPP

Originally conceptualized as a societal-level construct [ 47 ], collectivism reflects a culture that highlights the importance of collective goals over individual goals, defines the self in relation to the group, and places great emphasis on conformity, harmony and interdependence [ 48 ]. Some scholars propose to also examine cultural values at the individual level as culture is embedded within every individual and could vary significantly among individuals, further exerting effects on their perceptions, attitudes, and behaviors [ 49 ]. Corresponding to the construct at the macro-cultural level, micro-psychometric collectivism which reflects personality tendencies is characterized by an interdependent view of the self, a strong sense of other-orientation, and a great concern for the public good [ 50 ].

A few prior studies have indicated that collectivism might influence the magnitude of TPP. For instance, Lee and Tamborini [ 23 ] found that collectivism had a significant negative effect on the magnitude of TPP concerning Internet pornography. Such an impact can be understood in terms of biased optimism and social distance. Collectivists tend to view themselves as an integral part of a greater social whole and consider themselves less differentiated from others [ 51 ]. Collectivism thus would mitigate the third-person perception due to a smaller perceived social distance between individuals and other social members and a lower level of comparative optimism [ 22 , 23 ]. Based on this line of reasoning, we thus propose the following hypothesis:

H3: As medical students’ collectivism increases, the TPP concerning digital misinformation will become smaller.

Behavioral consequences of TPE in the misinformation context

The behavioral consequences triggered by TPE have been classified into three categories: restrictive actions refer to support for censorship or regulation of socially undesirable content such as pornography or violence on television [ 52 ]; corrective action is a specific type of behavior where people seek to voice their own opinions and correct the perceived harmful or ambiguous messages [ 53 ]; promotional actions target media content with desirable influence, such as advocating for public service announcements [ 24 ]. In a word, restriction, correction and promotion are potential behavioral outcomes of TPE concerning messages with varying valence of social desirability [ 16 ].

Restrictive action as an outcome of third-person perceptual bias (i.e., the perceptual component of TPE positing that people tend to perceive media messages as having a greater impact on others than on themselves) has received substantial scholarly attention in past decades; scholars thus suggest that TPE scholarship go beyond this tradition and move toward the exploration of corrective and promotional behaviors [ 16 , 24 ]. Moreover, individual-level corrective and promotional actions deserve more investigation specifically in the context of countering misinformation, as efforts from networked citizens have been documented as an important supplement to institutional regulations (e.g., drafting policy initiatives to counter misinformation) and platform-based measures (e.g., improving platform algorithms for detecting misinformation) [ 8 ].

In this study, corrective action specifically refers to individuals’ reactive behaviors that seek to rectify misinformation; these include such actions as debunking online misinformation by commenting, flagging, or reporting it [ 3 , 54 ]. Promotional action involves advancing correct information online, including in response to misinformation that has already been disseminated to the public [ 55 ].

The impact of TPP on corrective and promotional actions

Either paternalism theory [ 56 ] or protection motivation theory [ 57 ] can serve as an explanatory framework for the behavioral outcomes triggered by third-person perception. According to these theories, people act upon TPP because they believe they know better and feel obligated to protect those who are more vulnerable to negative media influence [ 58 ]. That is, corrective and promotional actions as behavioral consequences of TPP might be driven by a protective concern for others and a positive sense of self.

To date, several empirical studies across contexts have examined the link between TPP and corrective actions. Koo et al. [ 8 ], for instance, found TPP was not only positively related to respondents’ willingness to correct misinformation propagated by others, but also was positively associated with their self-correction. Other studies suggest that TPP motivates individuals to engage in both online and offline corrective political participation [ 59 ], give a thumbs down to a biased story [ 60 ], and implement corrective behaviors concerning “problematic” TV reality shows [ 16 ]. Based on previous research evidence, we thus propose the following hypothesis:

H4: Medical students with higher degrees of TPP will report greater intentions to correct digital misinformation.

Compared to correction, promotional behavior has received less attention in TPE research. Promotion commonly occurs when harmful messages have already been disseminated to the public and others appear to have been influenced by them; it serves as a remedial action that amplifies messages with positive influence, which may in turn mitigate the detrimental effects of the harmful messages [ 16 ].

Within this line of research, however, empirical findings are mixed. Wei and Golan [ 24 ] found a positive association between TPP of desirable political ads and promotional social media activism, such as posting or linking to the ad on one's social media account. Sun et al. [ 16 ], by contrast, found a negative association between TPP regarding clarity and community-connection public service announcements (PSAs) and promotional behaviors such as advocating for airing more PSAs in TV shows.

As promotional action remains underexplored in TPE research and the existing evidence on the link between TPP and promotion is mixed, we propose an exploratory research question:

RQ1: What is the relationship between TPP and medical students’ intentions to promote corrective information?

The impact of self-efficacy and collectivism on actions against misinformation

According to social cognitive theory, people with higher levels of self-efficacy tend to believe they are competent and capable and are more likely to execute specific actions [ 43 ]. Within the context of digital misinformation, individuals might become more willing to engage in misinformation correction if they have enough knowledge and confidence to evaluate information, and possess sufficient skills to verify information through digital tools and services [ 61 ].

Accordingly, we assumed that medical students with higher levels of digital misinformation self-efficacy would be more active in the fight against misinformation.

H5: Medical students with higher levels of digital misinformation self-efficacy will report greater intentions to (a) correct misinformation and (b) promote corrective information on social media.

The social actions of collectivists are strongly guided by prevailing social norms, collective responsibilities, and common interests, goals, and obligations [ 48 ]. Hence, highly collectivistic individuals are more likely to sacrifice self-interest for group interests and are more oriented toward pro-social behaviors, such as adopting pro-environmental behaviors [ 62 ], sharing knowledge [ 23 ], and providing help for people in need [ 63 ].

Fighting misinformation, especially through self-initiated corrective and promotional actions, is also considered a form of altruism, as such actions are costly to the actor (i.e., they take up time and energy) but benefit the general public [ 61 ]. Accordingly, we assume that collectivism might play a role in prompting people to engage in reactive behaviors against misinformation.

It is also worth noting that collectivist values are deeply rooted in Chinese society and were strongly advocated during the COVID-19 outbreak in an attempt to motivate prosocial behaviors [ 63 ]. Accordingly, we expected that the more medical students were oriented toward collectivist values, the more likely they would be to feel personally obliged and normatively motivated to engage in misinformation correction. However, as empirical evidence was limited, we proposed an exploratory research question:

RQ2: Will medical students with higher levels of collectivism report greater intentions to (a) correct misinformation and (b) promote corrective information on social media?

The theoretical model

To integrate both the antecedents and consequences of TPP, we proposed a theoretical model (shown in Fig. 1 ) to examine how professional identification, self-efficacy and collectivism would influence the magnitude of TPP, and how TPP would in turn influence medical students' intentions to correct digital misinformation and promote corrective information. Thus, RQ3 was proposed:

RQ3: Will the TPP mediate the impact of self-efficacy and collectivism on medical students' intentions to (a) correct misinformation, and (b) promote corrective information on social media?

Fig. 1 The proposed theoretical model. DMSE = Digital Misinformation Self-efficacy; PIMC = Professional Identification with Medical Community; ICDM = Intention to Correct Digital Misinformation; IPCI = Intention to Promote Corrective Information

Methods

To examine the proposed hypotheses, this study utilized cross-sectional survey data from medical students at Tongji Medical College (TJMC) in China. TJMC is one of the birthplaces of modern Chinese medical education and was among the first institutions to offer an eight-year curriculum in clinical medicine. Further, TJMC is located in Wuhan, the epicenter of the initial COVID-19 outbreak, so its students might have found the pandemic especially relevant, and threatening, to them.

The survey instrument was pilot tested with a convenience sample of 58 respondents, leading to minor refinements to a few items. Upon approval from the university's Institutional Review Board (IRB), the formal investigation was launched at TJMC in April 2022. Given the challenges of reaching the whole target population and acquiring an appropriate sampling frame, this study employed purposive and convenience sampling.

We first contacted four school counselors as survey administrators through email, with a letter explaining the objective of the study and requesting cooperation. All survey administrators were trained by the principal investigator to help with data collection in four majors (i.e., basic medicine, clinical medicine, nursing, and public health). Paper-and-pencil questionnaires were distributed to students at the regular weekly departmental meetings of each major, which students in all grades (undergraduate, master's, and doctoral) were required to attend. Completing the survey took approximately 10–15 min. The survey administrators informed students that participation was voluntary, that their responses would remain confidential and secure, and that the data would be used only for academic purposes. Although a total of 1,500 participants took the survey, 17 responses were excluded from the analysis because they failed the attention filters. Ultimately, 1,483 surveys were deemed valid for analysis.

Of the 1,483 respondents, 624 (42.10%) were men and 855 (57.70%) were women; four did not report their gender. The average age of the sample was 22.00 years (SD = 2.54, range 17–40). Regarding majors, 387 (26.10%) respondents were in basic medicine, 390 (26.30%) in clinical medicine, 307 (20.70%) in nursing, and 399 (26.90%) in public health. In terms of university class, 1,041 (70.40%) were undergraduates, 291 (19.70%) were master's students, 146 (9.90%) were doctoral students, and five did not report their class.

Measurement of key variables

Perceived effects of digital misinformation on oneself and on others

Three items adapted from previous research [ 33 , 64 ] were employed to measure the perceived effects of digital misinformation on oneself. Respondents were asked to indicate to what extent they agreed with the following: (1) I am frequently concerned that the information about COVID-19 I read on social media might be false; (2) Misinformation on social media might misguide my understanding of the coronavirus; (3) Misinformation on social media might influence my decisions regarding COVID-19. Responses were given on a 7-point scale, where 1 meant "strongly disagree" and 7 meant "strongly agree." The measure of perceived effects of digital misinformation on others consisted of three parallel items with the same statements, except that "I" and "my" were replaced with "the general others" and "their." The three "self" items were averaged to create a measure of "perceived effects on oneself" (M = 3.98, SD = 1.49, α = 0.87). The three "others" items were likewise averaged to form an index of "perceived effects on others" (M = 4.62, SD = 1.32, α = 0.87).

The perceived self-other disparity (TPP)

TPP was derived by subtracting perceived effects on oneself from perceived effects on others.
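This construction is purely arithmetic: item averages for each referent, then a difference score. The sketch below is a minimal illustration, not the authors' code; the file name and item column names are hypothetical.

```python
# Minimal sketch of scale construction: average the "self" and "others" items,
# then compute TPP as the self-other perceptual gap. Column names are hypothetical.
import pandas as pd

df = pd.read_csv("survey_items.csv")  # hypothetical item-level data, 7-point responses

df["effects_self"] = df[["self_1", "self_2", "self_3"]].mean(axis=1)
df["effects_others"] = df[["others_1", "others_2", "others_3"]].mean(axis=1)

# TPP = perceived effects on others minus perceived effects on oneself
df["tpp"] = df["effects_others"] - df["effects_self"]

print(df[["effects_self", "effects_others", "tpp"]].describe())
```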

Professional identification with medical community

Professional identification was measured using a three-item, 7-point Likert-type scale (1 = strongly disagree, 7 = strongly agree) adapted from previous studies [ 65 , 66 ]. Respondents indicated to what extent they agreed with the following statements: (1) I would be proud to be a medical staff member in the future; (2) I am committed to my major; and (3) I will be in an occupation that matches my current major. The three items were averaged to create a composite measure of professional identification (M = 5.34, SD = 1.37, α = 0.88).

Digital misinformation self-efficacy

Self-efficacy was measured with three items modified from previous studies [ 3 ]. Respondents were asked to indicate, on a 7-point Likert scale from 1 (strongly disagree) to 7 (strongly agree), their agreement with the following: (1) I think I can identify misinformation relating to COVID-19 on social media by myself; (2) I know how to verify misinformation regarding COVID-19 by using digital tools such as Tencent Jiaozhen (Footnote 1) and Piyao.org.cn (Footnote 2); (3) I am confident in my ability to identify digital misinformation relating to COVID-19. A composite measure of self-efficacy was constructed by averaging the three items (M = 4.38, SD = 1.14, α = 0.77).

Collectivism

Collectivism was measured using four items adapted from previous research [ 67 ], in which respondents were asked to indicate their agreement with the following statements on a 7-point scale from 1 (strongly disagree) to 7 (strongly agree): (1) Individuals should sacrifice self-interest for the group; (2) Group welfare is more important than individual rewards; (3) Group success is more important than individual success; and (4) Group loyalty should be encouraged even if individual goals suffer. The average of the four items was used to create a composite index of collectivism (M = 4.47, SD = 1.30, α = 0.89).

Intention to correct digital misinformation

We used three items adapted from past research [ 68 ] to measure respondents' intention to correct misinformation on social media. All items were scored on a 7-point scale from 1 (very unlikely) to 7 (very likely): (1) I will post a comment saying that the information is wrong; (2) I will message the person who posts the misinformation to tell him/her the post is wrong; (3) I will track the progress of social media platforms in dealing with the wrong post (i.e., whether it is deleted or corrected). A composite measure of "intention to correct digital misinformation" was constructed by averaging the three items (M = 3.39, SD = 1.43, α = 0.81).

Intention to promote corrective information

On a 7-point scale ranging from 1 (very unlikely) to 7 (very likely), respondents were asked to indicate their intentions to (1) Retweet the corrective information about coronavirus on my social media account; (2) Share the corrective information about coronavirus with others through Social Networking Services. The two items were averaged to create a composite measure of “intention to promote corrective information” ( M  = 4.60, SD  = 1.68, r  = 0.77).

Control variables

We included gender, age, class (1 = undergraduate degree; 2 = master's degree; 3 = doctoral degree), and clinical internship (0 = none; 1 = less than 0.5 year; 2 = 0.5 to 1.5 years; 3 = 1.5 to 3 years; 4 = more than 3 years) as control variables in the analyses. Additionally, coronavirus-related information exposure (i.e., how frequently respondents were exposed to information about COVID-19 on Weibo, WeChat, and QQ) and misinformation exposure on social media (i.e., how frequently they were exposed to misinformation about COVID-19 on Weibo, WeChat, and QQ) were assessed as control variables, because previous studies [ 69 , 70 ] have found them relevant to misinformation-related behaviors. Descriptive statistics and bivariate correlations between the main variables are shown in Table 1 .

Statistical analysis

We ran confirmatory factor analysis (CFA) in Mplus (version 7.4, Muthén & Muthén, 1998) to ensure the construct validity of the scales. To examine the associations between variables and test our hypotheses, we performed structural equation modeling (SEM). Mplus was chosen over other SEM packages mainly because the data set included some missing data, and Mplus handles missing data with full-information maximum likelihood estimation, which enabled us to include all available data [ 71 , 72 ]. Mplus is also flexible in simultaneously handling continuous, categorical, observed, and latent variables in a variety of models, and it reports a variety of useful information in a concise manner [ 73 ].
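The authors estimated the model in Mplus; purely as an illustrative analogue, a structurally similar specification could be written in lavaan-style syntax with the open-source semopy package in Python. The item names below are hypothetical, TPP enters as the observed difference score, and control variables are omitted for brevity.

```python
# Illustrative sketch only (not the authors' Mplus code): five latent constructs plus
# the observed TPP difference score, with TPP mediating the antecedent -> intention paths.
import pandas as pd
from semopy import Model

# Measurement model: five latent constructs; structural model: TPP (observed
# difference score) is regressed on the antecedents and predicts both intentions.
desc = """
DMSE =~ se_1 + se_2 + se_3
COLL =~ col_1 + col_2 + col_3 + col_4
PIMC =~ pi_1 + pi_2 + pi_3
ICDM =~ corr_1 + corr_2 + corr_3
IPCI =~ prom_1 + prom_2
tpp ~ PIMC + DMSE + COLL
ICDM ~ tpp + DMSE + COLL
IPCI ~ tpp + DMSE + COLL
"""

df = pd.read_csv("survey_items.csv")   # hypothetical item-level data with a 'tpp' column
model = Model(desc)
model.fit(df)                          # estimates measurement and structural parameters
print(model.inspect())                 # parameter estimates with standard errors
```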

Results

Table 2 shows the model fit information for the measurement and structural models. Five latent variables were specified in the measurement model. To test the measurement model, we examined Cronbach's alpha, composite reliability (CR), and average variance extracted (AVE) (Table 1 ). Cronbach's alpha values ranged from 0.77 to 0.89. The CRs, which ranged from 0.78 to 0.91, exceeded the level of 0.70 recommended by Fornell (1982), confirming internal consistency. The AVE estimates, which ranged from 0.54 to 0.78, exceeded the 0.50 lower limit recommended by Fornell and Larcker (1981), supporting convergent validity. All square roots of the AVEs were greater than the off-diagonal correlations in the corresponding rows and columns [ 74 ], so discriminant validity was assured. In sum, the measurement model showed sufficient convergent and discriminant validity.
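For readers unfamiliar with these reliability statistics, the sketch below shows how CR and AVE are computed from standardized factor loadings; it is illustrative only, and the example loadings are hypothetical rather than taken from the paper.

```python
# Composite reliability (CR) and average variance extracted (AVE) from
# standardized factor loadings. Illustrative only; loadings are hypothetical.
def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    total = sum(loadings)
    error_variance = sum(1 - l ** 2 for l in loadings)  # assumes standardized loadings
    return total ** 2 / (total ** 2 + error_variance)

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings."""
    return sum(l ** 2 for l in loadings) / len(loadings)

loadings = [0.78, 0.82, 0.74]  # hypothetical three-item construct
print(round(composite_reliability(loadings), 2))        # 0.82
print(round(average_variance_extracted(loadings), 2))   # 0.61
```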

Five model fit indices were used to assess the model: the relative chi-square ratio (χ2/df), the comparative fit index (CFI), the Tucker–Lewis index (TLI), the root mean square error of approximation (RMSEA), and the standardized root-mean-square residual (SRMR). A normed chi-square between 1 and 5 is acceptable [ 75 ]; TLI and CFI over 0.95 are considered acceptable; and an SRMR value below 0.08 together with an RMSEA value below 0.06 indicates good fit [ 76 ]. Based on these criteria, the model was found to have an acceptable fit to the data.
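As a compact restatement of these cutoffs, the helper below encodes the criteria just cited; it is a sketch for illustration, not part of the authors' analysis.

```python
# Encodes the fit criteria cited above: normed chi-square between 1 and 5,
# CFI and TLI at or above 0.95, RMSEA below 0.06, SRMR below 0.08.
def acceptable_fit(chi2, df, cfi, tli, rmsea, srmr):
    normed_chi2 = chi2 / df
    return all([
        1 <= normed_chi2 <= 5,
        cfi >= 0.95,
        tli >= 0.95,
        rmsea < 0.06,
        srmr < 0.08,
    ])

# Example with hypothetical fit statistics:
print(acceptable_fit(chi2=1250.0, df=480, cfi=0.96, tli=0.95, rmsea=0.05, srmr=0.04))  # True
```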

Figure 2 presents the results of our hypothesized model. H1 was rejected as professional identification failed to predict TPP ( β  = 0.06, p  > 0.05). Self-efficacy was positively associated with TPP ( β  = 0.14, p  < 0.001) while collectivism was negatively related to TPP ( β  = -0.10, p  < 0.01), lending support to H2 and H3.

Fig. 2 Results of the hypothesized model. Note. N = 1,483. The coefficients of relationships between latent variables are standardized beta coefficients. Significant paths are indicated by solid lines; non-significant paths are indicated by dotted lines. * p < .05, ** p < .01, *** p < .001. DMSE = Digital Misinformation Self-efficacy; PIMC = Professional Identification with Medical Community; ICDM = Intention to Correct Digital Misinformation; IPCI = Intention to Promote Corrective Information

H4 posited that medical students with higher degrees of TPP would report greater intentions to correct digital misinformation. However, we found a negative association between TPP and intentions to correct misinformation ( β  = -0.12, p  < 0.001). H4 was thus rejected. Regarding RQ1, results revealed that TPP was negatively associated with intentions to promote corrective information ( β  = -0.08, p  < 0.05).

Further, our results supported H5 as we found that self-efficacy had a significant positive relationship with corrective intentions ( β  = 0.18, p  < 0.001) and promotional intentions ( β  = 0.32, p  < 0.001). Collectivism was also positively associated with intentions to correct misinformation ( β  = 0.14, p  < 0.001) and promote corrective information ( β  = 0.20, p  < 0.001), which answered RQ2.

Regarding RQ3 (see Table 3 ), TPP significantly mediated the relationship between self-efficacy and intentions to correct misinformation ( β  = -0.016), as well as the relationship between self-efficacy and intentions to promote corrective information ( β  = -0.011). However, TPP failed to mediate either the association between collectivism and corrective intentions ( β  = 0.011, ns ) or the association between collectivism and promotional intentions ( β  = 0.007, ns ).
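As a rough consistency check, and assuming the indirect effects in Table 3 follow the standard product-of-coefficients logic (an assumption on our part; the computation is not spelled out in the text), the mediated paths can be approximated from the coefficients in Fig. 2.

```python
# Product-of-coefficients check for the self-efficacy paths (assumption: Table 3 uses this logic).
a = 0.14           # self-efficacy -> TPP
b_correct = -0.12  # TPP -> intention to correct misinformation
b_promote = -0.08  # TPP -> intention to promote corrective information
print(round(a * b_correct, 3), round(a * b_promote, 3))  # -0.017 and -0.011, close to the reported -0.016 and -0.011
```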

Discussion

Recent research has highlighted the role of health professionals and scientists in the fight against misinformation as they are considered knowledgeable, ethical, and reliable [ 5 , 77 ]. This study moved a step further by exploring the great potential of pre-professional medical students to tackle digital misinformation. Drawing on TPE theory, we investigated how medical students perceived the impact of digital misinformation, the influence of professional identification, self-efficacy and collectivism on these perceptions, and how these perceptions would in turn affect their actions against digital misinformation.

In line with prior studies [ 3 , 63 ], this research revealed that self-efficacy and collectivism played a significant role in shaping the magnitude of third-person perception, while professional identification had no significant impact on TPP. As shown in Table 1 , professional identification was positively associated with perceived effects of misinformation on oneself (r = 0.14, p < 0.001) and on others (r = 0.20, p < 0.001) simultaneously, which might result in a diminished TPP. What explains this shared, joint influence of professional identification on self and others? A potential explanation is that even medical staff had poor knowledge about the novel coronavirus during the initial outbreak [ 78 ]. Accordingly, identification with the medical community was insufficient to create an optimistic bias concerning the identification of misinformation about COVID-19.

Our findings indicated that TPP was negatively associated with medical students' intentions to correct misinformation and promote corrective information, which contradicted our expectations but was consistent with some previous TPP research conducted in the context of perceived risk [ 10 , 79 , 80 , 81 ]. For instance, Stavrositu and Kim (2014) found that increased TPP regarding cancer risk was negatively associated with behavioral intentions to engage in further cancer information search/exchange, as well as to adopt preventive lifestyle changes. Similarly, Wei et al. (2008) found, for avian flu news, that TPP negatively predicted the likelihood of engaging in actions such as seeking relevant information and getting vaccinated. In contrast, the perceived effects of avian flu news on oneself emerged as a positive predictor of intentions to take protective action.

Our study shows a similar pattern: perceived effects of misinformation on oneself were positively associated with intentions to correct misinformation (r = 0.06, p < 0.05) and to promote corrective information (r = 0.10, p < 0.001; see Table 1 ). While the reasons for these behavioral patterns remain somewhat elusive, the findings are indicative of human nature. When people perceive misinformation-related risk to be highly personally relevant, they do not take chances. However, when they perceive others to be more vulnerable than themselves, a set of sociopsychological dynamics, such as self-defense mechanisms, positive illusions, optimistic bias, and social comparison, restrains their intention to engage in corrective and promotional actions against misinformation [ 81 ].

In addition to the indirect effects via TPP, our study also revealed that self-efficacy and collectivism serve as direct and powerful drivers of corrective and promotional actions. Consistent with previous literature [ 61 , 68 ], individuals are more willing to engage in social corrections of misinformation if they possess enough knowledge, skills, abilities, and resources to identify it, given that correcting misinformation is difficult and their efforts will not necessarily yield positive outcomes. Collectivists are also more likely to engage in misinformation correction, as they are concerned for the public good and social benefits and aim to protect vulnerable people from being misguided by misinformation [ 82 ].

This study offers several theoretical advancements. First, it extends TPE theory by moving beyond the examination of restrictive actions toward the exploration of corrective and promotional actions in the context of misinformation. This exploratory investigation suggests that the self-other asymmetry in perceptions of misinformation's influence did shape individuals' actions against misinformation, but in an unexpected direction. The results also suggest that using TPP alone to predict behavioral outcomes is deficient, as it only "focuses on differences between 'self' and 'other' while ignoring situations in which the 'self' and 'other' are jointly influenced" [ 83 ]. Future research could therefore provide a more sophisticated understanding of third-person effects on behavior by comparing perceived effects on oneself, perceived effects on others, and the third-person perception in terms of the pattern and strength of their effects on behavioral outcomes.

Moreover, institutionalized corrective solutions such as government and platform regulation are not exhaustive [ 84 , 85 ]; it is therefore critical to tap the potential of the crowd to join the fight against misinformation [ 8 ]. So far, however, research on the motivations underlying users' active countering of misinformation has been scarce. The current paper helps bridge this gap by exploring the role of self-efficacy and collectivism in predicting medical students' intentions to correct misinformation and promote corrective information. We found parallel effects of the ability-related factor (self-efficacy) and the collective-responsibility-related factor (collectivism) on both intentions. That is, in a collectivist society such as China, cultivating a sense of collective responsibility and obligation to tackle misinformation (e.g., persuasive messaging that emphasizes the collective benefits of social corrections of misinformation), in parallel with systematic medical education and digital literacy training (particularly in handling fact-checking tools and acquiring Internet skills for information seeking and verification), would be an effective way to encourage medical students to engage in active countering behaviors against misinformation. Such means of encouraging social corrections of misinformation might also be applied to the general public.

In practical terms, this study lends new perspectives to current efforts to deal with digital misinformation by involving pre-professionals (in this case, medical students) in the fight against misinformation. As digital natives, medical students usually spend considerable time online, have developed sophisticated digital competencies, and are equipped with basic medical knowledge, and they therefore possess great potential for tackling digital misinformation. This study further sheds light on how to motivate medical students to become active in thwarting digital misinformation, which can help guide strategies to enlist pre-professionals to reduce the spread and threat of misinformation. For example, collectivism education in parallel with digital literacy training would help increase medical students' sense of responsibility for, and confidence in, tackling misinformation, thus encouraging them to engage in active countering behaviors.

This study also has limitations. First, the cross-sectional survey design does not allow us to make causal claims. Although the proposed direction of causality is in line with extant theorizing, reverse causal relationships remain possible. To establish causality, experimental or longitudinal studies would be more appropriate. The second limitation lies in the generalizability of our findings. Because the study focused on medical students in Chinese society, one should be cautious in generalizing the findings to other populations and cultures. For example, the effects of collectivism on actions against misinformation might differ between Eastern and Western cultures. Further studies would benefit from replication in diverse contexts and with diverse populations to increase the generalizability of the findings.

Conclusions

Drawing on TPE theory, our study revealed that TPP failed to motivate medical students to correct misinformation or promote corrective information. However, self-efficacy and collectivism were found to serve as direct and powerful drivers of corrective and promotional actions. Accordingly, in a collectivist society such as China's, cultivating a sense of collective responsibility for tackling misinformation, in parallel with interventions that strengthen personal efficacy, would be an effective way to encourage medical students, and even the general public, to engage actively in countering misinformation.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Footnote 1: The Tencent Jiaozhen Fact-Checking Platform, which comprises the Tencent information verification tool, allows users to check information authenticity through keyword searching. The tool is updated on a daily basis and adopts a human-machine collaboration approach to discovering, verifying, and refuting rumors and false information. To refute rumors, Tencent Jiaozhen publishes verified content on the homepage of Tencent's rumor-refuting platform and uses algorithms to push this content to users exposed to the relevant rumors through the WeChat dispelling assistant.

Footnote 2: Piyao.org.cn is hosted by the Internet Illegal Information Reporting Center under the Office of the Central Cyberspace Affairs Commission and operated by Xinhuanet.com. The platform collects statements from Twitter-like services, news portals, and China's biggest search engine, Baidu, to refute online rumors and expose the scams of phishing websites. It has integrated over 40 local rumor-refuting platforms and uses artificial intelligence to identify rumors.

Dhawan D, Bekalu M, Pinnamaneni R, McCloud R, Viswanath K. COVID-19 news and misinformation: do they matter for public health prevention? J Health Commun. 2021;26:799–808.

Janmohamed K, Walter N, Nyhan K, Khoshnood K, Tucker JD, Sangngam N, et al. Interventions to mitigate COVID-19 misinformation: a systematic review and meta-analysis. J Health Commun. 2021;26:846–57.

Cheng Y, Chen ZF. The influence of presumed fake news influence: examining public support for corporate corrective response, media literacy interventions, and governmental regulation. Mass Commun Soc. 2020;23:705–29.

Earnshaw VA, Katz IT. Educate, amplify, and focus to address COVID-19 misinformation. JAMA Health Forum. 2020;1:e200460.

Bautista JR, Zhang Y, Gwizdka J. Healthcare professionals’ acts of correcting health misinformation on social media. Int J Med Inf. 2021;148:104375.

O’Doherty D, Lougheed J, Hannigan A, Last J, Dromey M, O’Tuathaigh C, et al. Internet skills of medical faculty and students: is there a difference? BMC Med Educ. 2019;19:39.

Davison WP. The third-person effect in communication. Public Opin Q. 1983;47(1):1–15.

Koo AZ-X, Su M-H, Lee S, Ahn S-Y, Rojas H. What motivates people to correct misinformation? Examining the effects of third-person perceptions and perceived norms. J Broadcast Electron Media. 2021;65:111–34.

Oktavianus J, Bautista JR. Motivating healthcare professionals to correct online health misinformation: the roles of subjective norm, third-person perception, and channel differences. Comput Hum Behav. 2023;147:107839.

Tang S, Willnat L, Zhang H. Fake news, information overload, and the third-person effect in China. Glob Media China. 2021;6:492–507.

Chapin J. Third-person perception and Facebook. Int J Cyber Behav Psychol Learn. 2014;4:34–44.

Wei R, Lo V-H, Lu H-Y. Reconsidering the relationship between the third-person perception and optimistic bias. Commun Res. 2007;34:665–84.

Weinstein ND. Unrealistic optimism about future life events. J Pers Soc Psychol. 1980;39:802–20.

Liu X. Media exposure and third-person perception: the mediating role of social realism and proxy efficacy. 2021.

Yang J, Tian Y. “Others are more vulnerable to fake news than I Am”: Third-person effect of COVID-19 fake news on social media users. Comput Hum Behav. 2021;125:106950.

Sun Y, Shen L, Pan Z. On the behavioral component of the third-person effect. Commun Res. 2008;35:257–78.

Wei R, Lo V-H. The third-person effects of political attack ads in the 2004 U.S. Presidential election. Media Psychol. 2007;9:367–88.

Duck JM, Hogg MA, Terry DJ. Social identity and perceptions of media persuasion: are we always less influenced than others? J Appl Soc Psychol. 1999;29(9):1879–99.

Eveland WP, Nathanson AI, Detenber BH, McLeod DM. Rethinking the social distance corollary: perceived likelihood of exposure and the third-person perception. Commun Res. 1999;26:275–302.

Scharrer E. Third-person perception and television violence: the role of out-group stereotyping in perceptions of susceptibility to effects. Commun Res. 2002;29:681–704.

Brownlee K, Halverson G, Chassie A. Multiple relationships: maintaining professional identity in rural social work practice. J Compar Soc Work. 2012;7(1):81–91.

Hogg MA, Reid SA. Social identity, self-categorization, and the communication of group norms. Commun Theory. 2006;16:7–30.

Lee B, Tamborini R. Third-person effect and internet pornography: the influence of collectivism and internet self-efficacy. J Commun. 2005;55:292–310.

Wei R, Golan G. Political advertising on social media in the 2012 presidential election: exploring the perceptual and behavioral components of the third-person effect. Electron News. 2013;7:223–42.

Dong E, Du H, Gardner L. An interactive web-based dashboard to track COVID-19 in real time. Lancet Infect Dis. 2020;20:533–4. 

Jun Z, Weili W, Xin Z, Wei Z. Recommended psychological crisis intervention response to the 2019 novel coronavirus pneumonia outbreak in China: a model of West China Hospital. Precis Clin Med. 2020;3(1):3–8.

Shi Y, Zhang S, Fan L, Sun T. What motivates medical students to engage in volunteer behavior during the COVID-19 Outbreak? A large cross-sectional survey. Front Psychol. 2021;11:569765.

Passemard S, Faye A, Dubertret C, Peyre H, Vorms C, Boimare V, et al. COVID-19 crisis impact on the next generation of physicians: a survey of 800 medical students. BMC Med Educ. 2021;21(1):1–13.

Tempski P, Arantes-Costa FM, Kobayasi R, Siqueira MA, Torsani MB, Amaro BQ, Martins MA. Medical students’ perceptions and motivations during the COVID-19 pandemic. PLoS ONE. 2021;16(3):e0248627.

Perloff RM. Third-person effect research 1983–1992: a review and synthesis. Int J Public Opin Res. 1993;5:167–84.

Chen L, Fu L. Let’s fight the infodemic: the third-person effect process of misinformation during public health emergencies. Internet Res. 2022;32:1357–77.

Lee T. How people perceive influence of fake news and why it matters. Commun Q. 2021;69:431–53.

Liu PL, Huang LV. Digital disinformation about COVID-19 and the third-person effect: examining the channel differences and negative emotional outcomes. Cyberpsychol Behav Soc Netw. 2020;23:789–93.

Gunther AC, Thorson E. Perceived persuasive effects of product commercials and public service announcements: third-person effects in new domains. Commun Res. 1992;19:574–96.

Gunther A. What we think others think: cause and consequence in the third-person effect. Commun Res. 1991;18:355–72.

Reid SA, Hogg MA. A self-categorization explanation for the third-person effect. Hum Commun Res. 2005;31:129–61.

Cao B, Chen Z, Huang Y, Lo WH. Conflict between Mainland Chinese and Hong Kongers: a social identity perspective in explaining the hostile media phenomenon and the third-person effect. J Appl J Media Stud. 2014;3:225–40.

Tajfel H. Experimental studies of intergroup behaviour. In: Cognitive analysis of social behavior: proceedings of the NATO Advanced Study Institute on "The cognitive analysis of socio-psychological processes", Aix-en-Provence, France, July 12–31, 1981. Dordrecht: Springer Netherlands; 1982. p. 227–46.

Tajfel H, Turner JC. An integrative theory of intergroup conflict. In: Austin WG, Worchel S, editors. The social psychology of intergroup relations. Monterey: Brooks/Cole; 1979. p. 33–47.

Crocker J, Luhtanen R. Collective self-esteem and ingroup bias. J Pers Soc Psychol. 1990;58(1):60–7.

Singelis TM. The measurement of independent and interdependent self-construals. Pers Soc Psychol Bull. 1994;20(5):580–91.

Cho H, Lee JS. The influence of self-efficacy, subjective norms, and risk perception on behavioral intentions related to the H1N1 flu pandemic: a comparison between Korea and the US. Asian J Soc Psychol. 2015;18(4):311–24.

Bandura A, Freeman WH, Lightsey R. Self-efficacy: the exercise of control. J Cogn Psychother. 1999;13:158–66.

Bandura A. Guide for constructing self-efficacy scales. Self-efficacy beliefs of adolescents. 2006;5(1):307–37.

Pajares F. Self-efficacy beliefs in academic settings. Rev Educ Res. 1996;66:543–78.

Park JS, Ahn HY, Haley EJ. Optimistic bias, advertising skepticism, and consumer intentions for seeking information about the health risks of prescription medicine. Health Mark Q. 2017;34(2):81–96.

Hofstede GH. Culture’s consequences: comparing values, behaviors, institutions, and organizations across nations. 2nd ed. Thousand Oaks: Sage Publications; 2001.

Triandis HC. Individualism and Collectivism. 1st ed. New York: Routledge; 2018.

Wated G, Sanchez JI. Managerial tolerance of nepotism: the effects of individualism-collectivism in a Latin American Context. J Bus Ethics. 2015;130:45–57.

Markus HR, Kitayama S. Culture and the self: implications for cognition, emotion, and motivation. Psychol Rev. 1991;98(2):224–53.

Sullivan D, Landau MJ, Kay AC, Rothschild ZK. Collectivism and the meaning of suffering. J Pers Soc Psychol. 2012;103:1023–39.

Lo V, Wei R. Third-person effect, gender, and pornography on the Internet. J Broadcast Electron Media. 2002;46:13–33.

Barnidge M, Rojas H. Hostile media perceptions, presumed media influence, and political talk: expanding the corrective action hypothesis. Int J Public Opin Res. 2014;26:135–56.

Wintterlin F, Frischlich L, Boberg S, Schatto-Eckrodt T, Reer F, Quandt T. Corrective actions in the information disorder: the role of presumed media influence and hostile media perceptions for the countering of distorted user-generated content. Polit Commun. 2021;38:773–91.

Wei R, Lo V-H, Lu H-Y, Hou H-Y. Examining multiple behavioral effects of third-person perception: evidence from the news about Fukushima nuclear crisis in Taiwan. Chin J Commun. 2015;8:95–111.

McLeod DM, Eveland WP, Nathanson AI. Support for censorship of violent and misogynic rap lyrics: an analysis of the third-person effect. Commun Res. 1997;24:153–74.

Nathanson AI, Eveland WP Jr, Park H-S, Paul B. Perceived media influence and efficacy as predictors of caregivers’ protective behaviors. J Broadcast Electron Media. 2002;46:385–410.

McLeod DM, Detenber BH, Eveland WP. Behind the third-person effect: differentiating perceptual processes for self and other. J Commun. 2001;51:678–95.

Rojas H. "Corrective" actions in the public sphere: how perceptions of media and media effects shape political behaviors. Int J Public Opin Res. 2010;22:343–63.

Chung M, Munno GJ, Moritz B. Triggering participation: exploring the effects of third-person and hostile media perceptions on online participation. Comput Hum Behav. 2015;53:452–61.

Zhao L, Yin J, Song Y. An exploration of rumor combating behavior on social media in the context of social crises. Comput Hum Behav. 2016;58:25–36.

Sherman DK, Updegraff JA, Handy MS, Eom K, Kim HS. Beliefs and social norms as precursors of environmental support: the joint influence of collectivism and socioeconomic status. Pers Soc Psychol Bull. 2022;48:463–77.

Zhu Y, Wei R, Lo V-H, Zhang M, Li Z. Collectivism and altruistic behavior: a third-person effect study of COVID-19 news among Wuhan residents. Glob Media China. 2021;6:476–91.

Yang F, Horning M. Reluctant to share: how third person perceptions of fake news discourage news readers from sharing “real news” on social media. Soc Media Soc. 2020;6:205630512095517.

Adams K, Hean S, Sturgis P, Clark JM. Investigating the factors influencing professional identity of first-year health and social care students. Learn Health Soc Care. 2006;5(2):55–68.

Ding H, Wang J. Conflict and coordination: a study of the professional identity of the reserve force of media practitioners, taking journalism students at a Beijing university as an example. Chinese Journal of Journalism & Communication (Guoji Xinwenjie). 2019;2:113–31.

Yoo B, Donthu N, Lenartowicz T. Measuring Hofstede's five dimensions of cultural values at the individual level: development and validation of CVSCALE. J Int Consum Mark. 2011;23(3–4):193–210.

Tandoc EC, Lim D, Ling R. Diffusion of disinformation: how social media users respond to fake news and why. Journalism. 2020;21:381–98.

Tan ASL, Lee C, Chae J. Exposure to health (Mis)Information: lagged effects on young adults’ health behaviors and potential pathways. J Commun. 2015.

Tully M, Bode L, Vraga EK. Mobilizing users: does exposure to misinformation and its correction affect users’ responses to a health misinformation post? Soc Media Soc. 2020;6:205630512097837.

Arbuckle JL. Full information estimation in the presence of incomplete data. In: Marcoulides GA, Schumaker RE, editors. Advanced structural equation modeling: issues and techniques. Mahwah: Erlbaum; 1996. p. 243–77.

Narayanan A. A review of eight software packages for structural equation modeling. Am Stat. 2012;66(2):129–38.

Sakaria D, Maat SM, Mohd Matore MEE. Examining the optimal choice of SEM statistical software packages for sustainable mathematics education: a systematic review. Sustainability. 2023;15(4):3209.

Fornell C, Larcker DF. Evaluating structural equation models with unobservable variables and measurement error. J Mark Res. 1981;18(1):39–50.

Wheaton B, Muthen B, Alwin DF, Summers GF. Assessing reliability and stability in panel models. Sociol Methodol. 1977;8:84–136.

Hu LT, Bentler PM. Cutoff criteria for fit indexes in covariance structure analysis: conventional criteria versus new alternatives. Struct Equ Modeling. 1999;6(1):1–55.

Ho SS, Goh TJ, Leung YW. Let’s nab fake science news: predicting scientists’ support for interventions using the influence of presumed media influence model. Journalism. 2022;23:910–28.

Bhagavathula AS, Aldhaleei WA, Rahmani J, Mahabadi MA, Bandari DK. Knowledge and perceptions of COVID-19 among health care workers: cross-sectional study. JMIR Public Health Surveill. 2020;6(2):e19160.

Jung EH, Zhang L, Nekmat E. SNS usage and third-person effects in the risk perception of Zika virus among Singaporean Women. J Health Commun. 2020;25:736–44.

Stavrositu CD, Kim J. Social media metrics: third-person perceptions of health information. Comput Hum Behav. 2014;35:61–7.

Wei R, Lo VH, Lu HY. Third-person effects of health news: exploring the relationships among media exposure, presumed media influence, and behavioral intentions. Am Behav Sci. 2008;52:261–77.

Hong SC. Presumed effects of “fake news” on the global warming discussion in a cross-cultural context. Sustainability. 2020;12(5).

Neuwirth K, Frederick E. Extending the framework of third-, first-, and second-person effects. Mass Commun Soc. 2002;5:113–40.

Bastick Z. Would you notice if fake news changed your behavior? An experiment on the unconscious effects of disinformation. Comput Hum Behav. 2021;116.

Harff D, Bollen C, Schmuck D. Responses to social media influencers’ misinformation about COVID-19: a pre-registered multiple-exposure experiment. Media Psychol. 2022;25:831–50.

Acknowledgements

We thank all participants and staff working for the project.

Funding

This work was supported by the Humanities and Social Sciences Youth Foundation of the Ministry of Education of China (Grant No. 21YJC860012).

Author information

Authors and affiliations

Journalism and Information Communication School, Huazhong University of Science and Technology, Wuhan, Hubei, China

Zongya Li & Jun Yan

Contributions

Zongya Li wrote the main manuscript and Jun Yan collected the data. All authors reviewed the manuscript.

Corresponding author

Correspondence to Jun Yan .

Ethics declarations

Ethics approval and consent to participate

This study was approved by the Medical Ethics Committee of Union Hospital Affiliated to Tongji Medical College, Huazhong University of Science and Technology (approval number: 2022S009). All the participants provided informed consent to engage in this research.

Consent for publication

The authors give their consent for the publication of identifiable details, which can include photograph(s) and/or videos and/or case history and/or other details within the manuscript, to be published in BMC Public Health.

Competing interests

The authors declare no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Li, Z., Yan, J. Does a perceptual gap lead to actions against digital misinformation? A third-person effect study among medical students. BMC Public Health 24 , 1291 (2024). https://doi.org/10.1186/s12889-024-18763-9

Received : 08 December 2023

Accepted : 02 May 2024

Published : 11 May 2024

DOI : https://doi.org/10.1186/s12889-024-18763-9

Keywords

  • Digital misinformation
  • Third-person perception
  • Pre-professionals
  • Professional identification
