• Review article
  • Open access
  • Published: 22 January 2020

Mapping research in student engagement and educational technology in higher education: a systematic evidence map

  • Melissa Bond   ORCID: orcid.org/0000-0002-8267-031X 1 ,
  • Katja Buntins 2 ,
  • Svenja Bedenlier 1 ,
  • Olaf Zawacki-Richter 1 &
  • Michael Kerres 2  

International Journal of Educational Technology in Higher Education, volume 17, Article number: 2 (2020)


Abstract

Digital technology has become a central aspect of higher education, inherently affecting all aspects of the student experience. It has also been linked to an increase in behavioural, affective and cognitive student engagement, the facilitation of which is a central concern of educators. In order to delineate the complex nexus of technology and student engagement, this article systematically maps research from 243 studies published between 2007 and 2016. Research within the corpus was predominantly undertaken within the United States and the United Kingdom, with only limited research undertaken in the Global South, and largely focused on the fields of Arts & Humanities, Education, and Natural Sciences, Mathematics & Statistics. Studies most often used quantitative methods, followed by mixed methods, with qualitative methods employed far less often. Few studies provided a definition of student engagement, and fewer than half were guided by a theoretical framework. The courses investigated used blended learning and text-based tools (e.g. discussion forums) most often, with undergraduate students as the primary target group. Stemming from the use of educational technology, behavioural engagement was by far the most often identified dimension, followed by affective and cognitive engagement. This mapping article provides the grounds for further exploration into discipline-specific use of technology to foster student engagement.

Introduction

Over the past decade, the conceptualisation and measurement of ‘student engagement’ has received increasing attention from researchers, practitioners, and policy makers alike. Seminal works such as Astin’s (1999) theory of involvement, Fredricks, Blumenfeld, and Paris’s (2004) conceptualisation of the three dimensions of student engagement (behavioural, emotional, cognitive), and sociocultural theories of engagement such as Kahu (2013) and Kahu and Nelson (2018), have done much to shape and refine our understanding of this complex phenomenon. However, criticism about the strength and depth of student engagement theorising remains (e.g. Boekaerts, 2016; Kahn, 2014; Zepke, 2018), the quality of which has had a direct impact on the rigour of subsequent research (Lawson & Lawson, 2013; Trowler, 2010), prompting calls for further synthesis (Azevedo, 2015; Eccles, 2016).

In parallel to this increased attention on student engagement, digital technology has become a central aspect of higher education, inherently affecting all aspects of the student experience (Barak, 2018; Henderson, Selwyn, & Aston, 2017; Selwyn, 2016). International recognition of the importance of ICT skills and digital literacy has been growing, alongside mounting recognition of its importance for active citizenship (Choi, Glassman, & Cristol, 2017; OECD, 2015a; Redecker, 2017) and for the development of interdisciplinary and collaborative skills (Barak & Levenberg, 2016; Oliver & de St Jorre, 2018). Using technology has the potential to make teaching and learning processes more intensive (Kerres, 2013), improve student self-regulation and self-efficacy (Alioon & Delialioğlu, 2017; Bouta, Retalis, & Paraskeva, 2012), increase participation and involvement in courses as well as the wider university community (Junco, 2012; Salaber, 2014), and predict increased student engagement (Chen, Lambert, & Guidry, 2010; Rashid & Asghar, 2016). There is, however, no guarantee of active student engagement as a result of using technology (Kirkwood, 2009), with Tamim, Bernard, Borokhovski, Abrami, and Schmid’s (2011) second-order meta-analysis finding only a small to moderate impact on student achievement across 40 years. Rather, careful planning, sound pedagogy and appropriate tools are vital (Englund, Olofsson, & Price, 2017; Koehler & Mishra, 2005; Popenici, 2013), as “technology can amplify great teaching, but great technology cannot replace poor teaching” (OECD, 2015b, p. 4).

Due to its complexity, educational technology research has struggled to find a common definition and terminology with which to talk about student engagement, which has resulted in inconsistency across the field. For example, whilst 77% of articles reviewed by Henrie, Halverson, and Graham (2015) operationalised engagement from a behavioural perspective, most did not clearly define engagement, which is no longer considered acceptable in student engagement research (Appleton, Christenson, & Furlong, 2008; Christenson, Reschly, & Wylie, 2012). Linked to this, educational technology research has also lacked theoretical guidance (Al-Sakkaf, Omar, & Ahmad, 2019; Hew, Lan, Tang, Jia, & Lo, 2019; Lundin, Bergviken Rensfeldt, Hillman, Lantz-Andersson, & Peterson, 2018). A review of 44 random articles published in 2014 in the journals Educational Technology Research & Development and Computers & Education, for example, revealed that more than half had no guiding conceptual or theoretical framework (Antonenko, 2015), and only 13 out of 62 studies in a systematic review of flipped learning in engineering education reported theoretical grounding (Karabulut-Ilgu, Jaramillo Cherrez, & Jahren, 2018). Therefore, calls have been made for a greater understanding of the role that educational technology plays in affecting student engagement, in order to strengthen teaching practice and lead to improved outcomes for students (Castañeda & Selwyn, 2018; Krause & Coates, 2008; Nelson Laird & Kuh, 2005).

A reflection upon prior research in the field is a necessary first step towards meaningful discussion of how to foster student engagement in the digital age. In support of this aim, this article provides a synthesis of research on student engagement theory, and systematically maps empirical higher education research on student engagement and educational technology published between 2007 and 2016. Synthesising the vast body of literature on student engagement (for previous literature and systematic reviews, see Additional file 1), this article develops “a tentative theory” in the hopes of “plot[ting] the conceptual landscape…[and chart] possible routes to explore it” (Antonenko, 2015, pp. 57–67) for researchers, practitioners, learning designers, administrators and policy makers. It then discusses student engagement against the background of educational technology research, exploring prior literature and systematic reviews. The systematic review search method is then outlined, followed by the presentation and discussion of findings.

Literature review

What is student engagement?

Student engagement has been linked to improved achievement, persistence and retention (Finn, 2006; Kuh, Cruce, Shoup, Kinzie, & Gonyea, 2008), with disengagement having a profound effect on student learning outcomes and cognitive development (Ma, Han, Yang, & Cheng, 2015), and being a predictor of student dropout in both secondary school and higher education (Finn & Zimmer, 2012). Student engagement is a multifaceted and complex construct (Appleton et al., 2008; Ben-Eliyahu, Moore, Dorph, & Schunn, 2018), which some have called a ‘meta-construct’ (e.g. Fredricks et al., 2004; Kahu, 2013), and likened to blind men describing an elephant (Baron & Corbin, 2012; Eccles, 2016). There is ongoing disagreement about whether there are three components (e.g. Eccles, 2016)—affective/emotional, cognitive and behavioural—or whether there are four, with the recent suggested addition of agentic engagement (Reeve, 2012; Reeve & Tseng, 2011) and social engagement (Fredricks, Filsecker, & Lawson, 2016). There has also been confusion as to whether the terms ‘engagement’ and ‘motivation’ can and should be used interchangeably (Reschly & Christenson, 2012), especially when used by policy makers and institutions (Eccles & Wang, 2012). However, the prevalent understanding across the literature is that motivation is an antecedent to engagement: it is the intent and unobservable force that energises behaviour (Lim, 2004; Reeve, 2012; Reschly & Christenson, 2012), whereas student engagement is energy and effort in action; an observable manifestation (Appleton et al., 2008; Eccles & Wang, 2012; Kuh, 2009; Skinner & Pitzer, 2012), evidenced through a range of indicators.

Whilst it is widely accepted that no one definition exists that will satisfy all stakeholders (Solomonides, 2013), and no one project can be expected to examine every sub-construct of student engagement (Kahu, 2013), it is important for each research project to begin with a clear definition of its own understanding (Boekaerts, 2016). Therefore, in this project, student engagement is defined as follows:

Student engagement is the energy and effort that students employ within their learning community, observable via any number of behavioural, cognitive or affective indicators across a continuum. It is shaped by a range of structural and internal influences, including the complex interplay of relationships, learning activities and the learning environment. The more students are engaged and empowered within their learning community, the more likely they are to channel that energy back into their learning, leading to a range of short- and long-term outcomes that can likewise further fuel engagement.

Dimensions and indicators of student engagement

There are three widely accepted dimensions of student engagement: affective, cognitive and behavioural. Within each dimension there are several indicators of engagement, as well as of disengagement (see Additional file 2), which is now seen as a separate and distinct construct from engagement. It should be noted, however, that whilst these indicators have been drawn from a range of literature, this is not a finite list, and it is recognised that students might experience these indicators on a continuum at varying times (Coates, 2007; Payne, 2017), depending on their valence (positive or negative) and activation (high or low) (Pekrun & Linnenbrink-Garcia, 2012). There has also been disagreement over which dimension the indicators align with. For example, Järvelä, Järvenoja, Malmberg, Isohätälä, and Sobocinski (2016) argue that ‘interaction’ extends beyond behavioural engagement, covering both cognitive and/or emotional dimensions, as it involves collaboration between students, and Lawson and Lawson (2013) believe that ‘effort’ and ‘persistence’ are cognitive rather than behavioural constructs, as they “represent cognitive dispositions toward activity rather than an activity unto itself” (p. 465), which is represented in the table through the indicator ‘stay on task/focus’ (see Additional file 2). Further consideration of these disagreements is beyond the scope of this paper, however, and represents an area for future research.

Student engagement within educational technology research

The potential of educational technology to improve student engagement has long been recognised (Norris & Coutas, 2014); however, it is not merely a case of technology plus students equals engagement. Without careful planning and sound pedagogy, technology can promote disengagement and impede rather than help learning (Howard, Ma, & Yang, 2016; Popenici, 2013). Whilst still a young area, most of the research undertaken to gain insight into this has focused on undergraduate students (e.g. Henrie et al., 2015; Webb, Clough, O’Reilly, Wilmott, & Witham, 2017), with Chen et al. (2010) finding a positive relationship between the use of technology and student engagement, particularly earlier in university study. Research has also been predominantly focused on STEM and medicine (e.g. Li, van der Spek, Feijs, Wang, & Hu, 2017; Nikou & Economides, 2018), with at least five literature or systematic reviews published in the last 5 years focused on medicine, and nursing in particular (see Additional file 3). This indicates that further synthesis is needed of research in other disciplines, such as Arts & Humanities and Education, as well as further investigation into whether research continues to focus on undergraduate students.

The five most researched technologies in Henrie et al.’s (2015) review were online discussion boards, general websites, learning management systems (LMS), general campus software and videos, as opposed to Schindler, Burkholder, Morad, and Marsh’s (2017) literature review, which concentrated on social networking sites (Facebook and Twitter), digital games, wikis, web-conferencing software and blogs. Schindler et al. found that most of these technologies had a positive impact on multiple indicators of student engagement across the three dimensions of engagement, with digital games, web-conferencing software and Facebook the most effective. However, it must be noted that they only considered seven indicators of student engagement, a limitation that could be addressed by considering a broader set of indicators. Other reviews that have found at least a small positive impact on student engagement include those focused on audience response systems (Hunsu, Adesope, & Bayly, 2016; Kay & LeSage, 2009), mobile learning (Kaliisa & Picard, 2017), and social media (Cheston, Flickinger, & Chisolm, 2013). Specific indicators of engagement that increased as a result of technology include interest and enjoyment (Li et al., 2017), improved confidence (Smith & Lambert, 2014) and attitudes (Nikou & Economides, 2018), as well as enhanced relationships with peers and teachers (e.g. Alrasheedi, Capretz, & Raza, 2015; Atmacasoy & Aksu, 2018).

Literature and systematic reviews focused on student engagement and technology do not always include information on where studies have been conducted. Out of 27 identified reviews (see Additional file 3), only 14 report the countries included, and two of these were explicitly focused on a specific region or country, namely Africa and Turkey. Most of the research has been conducted in the USA, followed by the UK, Taiwan, Australia and China. Table 1 depicts the three countries from which most studies originated in the respective reviews, and highlights a clear lack of research conducted within mainland Europe, South America and Africa. Whilst this could be due to the choice of databases in which the literature was searched, it nevertheless highlights a substantial gap in the literature, and to that end, it will be interesting to see whether this review is able to substantiate or contradict these trends.

Research into student engagement and educational technology has predominantly used a quantitative methodology (see Additional file 3), with 11 literature and systematic reviews reporting that surveys, particularly self-report Likert-scale instruments, are the most used source of measurement (e.g. Henrie et al., 2015). Reviews that have included research using a range of methodologies have found a limited number of studies employing qualitative methods (e.g. Connolly, Boyle, MacArthur, Hainey, & Boyle, 2012; Kay & LeSage, 2009; Lundin et al., 2018). This has led to calls for further qualitative research exploring student engagement and technology, as well as for more rigorous research designs (e.g. Li et al., 2017; Nikou & Economides, 2018), including sampling strategies and data collection, in experimental studies in particular (Cheston et al., 2013; Connolly et al., 2012). However, not all reviews included information on methodologies used. Crook (2019), in his recent editorial in the British Journal of Educational Technology, stated that research methodology is a “neglected topic” (p. 487) within educational technology research, and stressed its importance for conducting studies that delve deeper into phenomena (e.g. longitudinal studies).

Therefore, this article presents an initial “evidence map” (Miake-Lye, Hempel, Shanman, & Shekelle, 2016, p. 19) of systematically identified literature on student engagement and educational technology within higher education, undertaken through a systematic review, in order to address the issues raised by prior research and to identify research gaps. These issues include the disparity between fields of study and study levels researched, the geographical distribution of studies, the methodologies used, and the theoretical fuzziness surrounding student engagement. This article, however, is intended to provide an initial overview of the systematic review method employed, as well as of the overall corpus. Further synthesis of possible correlations between student engagement and disengagement indicators and the co-occurrence of technology tools will be undertaken within field-specific articles (e.g. Bedenlier, 2020a; Bedenlier, 2020b), allowing more meaningful guidance on applying the findings in practice.

The following research questions guide this enquiry:

1. How do the studies in the sample ground student engagement and align with theory?

2. Which indicators of cognitive, behavioural and affective engagement were identified in studies where educational technology was used? Which indicators of student disengagement?

3. What are the learning scenarios, modes of delivery and educational technology tools employed in the studies?

Overview of the study

With the intent to systematically map empirical research on student engagement and educational technology in higher education, we conducted a systematic review. A systematic review is an explicitly and systematically conducted literature review that answers a specific question through applying a replicable search strategy, with studies then included or excluded based on explicit criteria (Gough, Oliver, & Thomas, 2012). Studies included for review are then coded and synthesised into findings that shed light on gaps, contradictions or inconsistencies in the literature, as well as providing guidance on applying findings in practice. This contribution maps the research corpus of 243 studies that were identified through a systematic search and ensuing random parameter-based sampling.

Search strategy and selection procedure

The initial inclusion criteria for the systematic review were peer-reviewed articles in the English language, empirically reporting on students and student engagement in higher education, and making use of educational technology. The search was limited to records between 1995 and 2016, chosen due to the implementation of the first Virtual Learning Environments and Learning Management Systems within higher education (see Bond, 2018). Articles were limited to those published in peer-reviewed journals, due to the rigorous process under which they are published and their trustworthiness in academia (Nicholas et al., 2015), although concerns within the scientific community about the peer-review process are acknowledged (e.g. Smith, 2006).

Discussion arose on how to approach the “hard-to-detect” (O’Mara-Eves et al., 2014, p. 51) concept of student engagement with regard to sensitivity versus precision (Brunton, Stansfield, & Thomas, 2012), particularly in light of engagement being Henrie et al.’s (2015) most important search term. The decision was made that the concept ‘student engagement’ would be identified from titles and abstracts at a later stage, during the screening process. In this way, articles that are indeed concerned with student engagement, but that use different terms to describe the concept, would still be included. Given the nature of student engagement as a meta-construct (e.g. Appleton et al., 2008; Christenson et al., 2012; Kahu, 2013), limiting the search to only articles including the term engagement might miss important research on other elements of student engagement. Hence, we opted for recall over precision. According to Gough et al. (2012, p. 13), “electronic searching is imprecise and captures many studies that employ the same terms without sharing the same focus”; the converse risk is disregarding studies that analyse the construct but use different terms to describe it.

With this in mind, the search strategy to identify relevant studies was developed iteratively with support from the University Research Librarian. As outlined in O’Mara-Eves et al. (2014) as a standard approach, we used reviewer knowledge—here strongly supported by certified expertise—and previous literature (e.g. Henrie et al., 2015; Kahu, 2013) to elicit concepts with potential importance under the topics student engagement, higher education and educational technology. The final search string (see Fig. 1) encompasses clusters of different educational technologies that were searched for separately, in order to avoid an overly long search string. It was decided not to include any brand names (e.g. Facebook, Twitter, Moodle), as it was again reasoned that scientific publications would use the broader term (e.g. social media). The final search string was slightly adapted, e.g. in the format required for truncations or wildcards, according to the settings of each database being used. Footnote 1

figure 1

Final search terms used in the systematic review

Four databases (ERIC, Web of Science, Scopus and PsycINFO) were searched in July 2017, and three researchers and a student assistant screened the abstracts and titles of the retrieved references between August and November 2017, using EPPI Reviewer 4.0. An initial 77,508 references were retrieved, and with the elimination of duplicate records, 53,768 references remained (see Fig. 2). A first cursory screening of records revealed that older research was more concerned with technologies that are now considered outdated (e.g. overhead projectors, floppy disks). Therefore, we opted to adjust the period to include research published between 2007 and 2016, a phase of research and practice entitled ‘online learning in the digital age’ (Bond, 2018). Whilst we initially opted for recall over precision, the decision was then made to search for specific facets of the student engagement construct (e.g. deep learning, interest and persistence) within EPPI-Reviewer, in order to further refine the corpus. These adaptations led to a remaining 18,068 records.
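The deduplication step described above (77,508 retrieved references reduced to 53,768) was performed within EPPI Reviewer; as a minimal illustrative sketch only — not the tool's actual logic, and using hypothetical records — duplicates retrieved from multiple databases can be dropped by keying each reference on a normalised title:

```python
def dedupe(records):
    """Keep the first occurrence of each record, keyed on a normalised title."""
    seen, unique = set(), []
    for rec in records:
        # Normalise: lowercase and strip everything but letters/digits
        key = "".join(ch for ch in rec["title"].lower() if ch.isalnum())
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# Hypothetical references retrieved from two databases
records = [
    {"title": "Student Engagement and Technology", "db": "ERIC"},
    {"title": "Student engagement and technology", "db": "Scopus"},
    {"title": "Flipped Learning in Engineering",   "db": "WoS"},
]
print(len(dedupe(records)))  # 2
```

Real reference managers additionally match on DOI, year and author lists, since titles alone can collide or vary across databases.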

figure 2

Systematic review PRISMA flow chart (slightly modified after Brunton et al., 2012, p. 86; Moher, Liberati, Tetzlaff, & Altman, 2009, p. 8)

Four researchers screened the first 150 titles and abstracts, in order to iteratively establish a joint understanding of the inclusion criteria. The remaining references were distributed equally amongst the screening team, which resulted in the inclusion of 4152 potentially relevant articles. Given the large number of articles for full-text screening, and the time constraints of project-based, funded work, it was decided that a sample of articles would be drawn from this corpus for further analysis. With the intention of drawing a sample that estimates the population parameters within a predetermined error range, we used methods of sample size estimation from the social sciences (Kupper & Hafner, 1989), implemented in the R package MBESS (Kelley, Lai, Lai, & Suggests, 2018). Accepting a 5% error range, a proportion of 0.5 and an alpha of 5%, 349 articles were sampled, with the sample then stratified by publishing year, as student engagement has become much more prevalent (Zepke, 2018) and educational technology has become more differentiated within the last decade (Bond, 2018). Two researchers screened the first 100 articles on full text, reaching an agreement of 88% on inclusion/exclusion. The researchers then discussed the discrepancies and came to an agreement on the remaining 12%. It was decided that further comparison screening was needed to increase the level of reliability. After screening the sample on full text, 232 articles remained for data extraction, which contained 243 studies.
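The sample size reported above was computed with R's MBESS package; as a rough cross-check, the textbook formula for estimating a proportion within a given margin of error, with a finite-population correction, can be sketched in Python (the Kupper–Hafner method used by MBESS differs slightly, so the result is close to but not identical with the 349 reported):

```python
import math

def sample_size(population, margin=0.05, p=0.5, z=1.96):
    """Sample size to estimate a proportion within +/- margin at alpha = 0.05
    (z = 1.96), with finite-population correction."""
    n0 = z ** 2 * p * (1 - p) / margin ** 2      # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)         # finite-population correction
    return math.ceil(n)

print(sample_size(4152))  # 352 — close to the 349 articles sampled in the review
```

With a very large population the correction vanishes and the familiar n ≈ 385 for a 5% margin is recovered.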

Data extraction process

In order to extract the article data, an extensive coding system was developed, including codes to extract information on the set-up and execution of the study (e.g. methodology, study sample), as well as information on the learning scenario, mode of delivery and educational technology used. Learning scenarios included broader pedagogies, such as social collaborative learning and self-determined learning, but also specific pedagogies such as flipped learning, given the increasing number of studies and interest in these approaches (e.g. Lundin et al., 2018). Specific examples of student engagement and/or disengagement were coded under cognitive, affective or behavioural (dis)engagement. The facets of student (dis)engagement were identified based on the literature review undertaken (see Additional file 2), and applied in this detailed manner to capture not only the overarching dimensions of the concept, but also their diverse sub-meanings. New indicators also emerged during the coding process, which had not initially been identified from the literature review, including ‘confidence’ and ‘assuming responsibility’. The 243 studies were coded with this extensive code set, and any disagreements that occurred between the coders were reconciled. Footnote 2

As over 50 individual educational technology applications and tools were identified in the 243 studies, in line with results found in other large-scale systematic reviews (e.g. Lai & Bower, 2019), concerns were raised over how the research team could meaningfully analyse and report the results. The decision was therefore made to employ Bower’s (2016) typology of learning technologies (see Additional file 4), in order to channel the tools into groups that share the same characteristics or “structure of information” (Bower, 2016, p. 773). Whilst it is acknowledged that some of the technology could be classified into more than one type within the typology (e.g. wikis can be used for individual composition, for collaborative tasks, or for knowledge organisation and sharing), “the type of learning that results from the use of the tool is dependent on the task and the way people engage with it rather than the technology itself”; therefore, “the typology is presented as descriptions of what each type of tool enables and example use cases rather than prescriptions of any particular pedagogical value system” (Bower, 2016, p. 774). For further elaboration on each category, please see Bower (2015).

Study characteristics

Geographical characteristics.

The systematic mapping reveals that the 243 studies were set in 33 different countries, whilst seven studies investigated settings in an international context, and three studies did not indicate their country setting. In 2% of the studies, the country was allocated based on the authors’ country of origin, where both authors came from the same country. The top five countries account for 158 studies (see Fig. 3), with 35.4% (n = 86) of studies conducted in the United States (US), 10.7% (n = 26) in the United Kingdom (UK), 7.8% (n = 19) in Australia, 7.4% (n = 18) in Taiwan, and 3.7% (n = 9) in China. Across the corpus, studies from countries employing English as the official or one of the official languages total 59.7% of the entire sample, followed by East Asian countries, which in total account for 18.8% of the sample. With the exception of the UK, European countries are largely absent from the sample: only 7.3% of the articles originate from this region, with countries such as France, Belgium, Italy and Portugal having no studies, and countries such as Germany and the Netherlands having one each. With eight articles, Spain is the most prolific European country outside of the UK. The geographical distribution of study settings also clearly shows an almost complete absence of studies undertaken within African contexts, with five studies from South Africa and one from Tunisia. Studies from South-East Asia, the Middle East, and South America are likewise low in number in this review. Whilst the global picture evokes an imbalance, this might be partially due to our search and sampling strategy, which focused on English language journals indexed in four primarily Western-focused databases.

figure 3

Percentage deviation from the average relative frequencies of the different data collection formats per country (≥ 3 articles). Note. NS = not stated; AUS = Australia; CAN = Canada; CHN = China; HKG = Hong Kong; inter = international; IRI = Iran; JAP = Japan; MYS = Malaysia; SGP = Singapore; ZAF = South Africa; KOR = South Korea; ESP = Spain; SWE = Sweden; TWN = Taiwan; TUR = Turkey; GBR = United Kingdom; USA = United States of America

Methodological characteristics

Within this literature corpus, 103 studies (42%) employed quantitative methods, 84 (35%) mixed methods, and 56 (23%) qualitative methods. Relating these numbers back to the contributing countries, different preferences for and frequencies of the methods used become apparent (see Fig. 3). As a general tendency, mixed methods and qualitative research occur more often in Western countries, whereas quantitative research is the preferred method in East Asian countries. For example, studies originating from Australia employ mixed methods research 28% more often than average, whereas Singapore is far below average, with 34.5% less mixed methods research than the other countries in the sample. In Taiwan, mixed methods studies are conducted 23.5% below average and qualitative research 6.4% below average, whereas quantitative research occurs 29.8% more often than average.

Amongst the qualitative studies, qualitative content analysis (n = 30) was the most frequently used analysis approach, followed by thematic analysis (n = 21) and grounded theory (n = 12). In many cases, however, the exact analysis approach was not reported (n = 37), could not be allocated to a specific classification (n = 22), or no method of analysis was identifiable (n = 11). Within studies using quantitative methods, mean comparison was used in 100 studies, frequency data was collected and analysed in 83 studies, and regression models were used in 40 studies. Furthermore, looking at the correlations between the different analysis approaches, only one significant correlation can be identified, between mean comparison and frequency data (−.246). Beyond that, correlations are small; for example, in only 14% of the studies were both mean comparisons and regression models employed.
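Correlations between analysis approaches, as reported above, amount to correlating binary “approach used” indicators across studies, i.e. phi coefficients (Pearson correlation applied to 0/1 vectors). A minimal sketch with hypothetical indicator vectors (not the review's data) illustrates the computation:

```python
import math

def phi(x, y):
    """Pearson correlation of two binary indicator vectors (the phi coefficient)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x) / n)
    sy = math.sqrt(sum((b - my) ** 2 for b in y) / n)
    return cov / (sx * sy)

# Hypothetical per-study indicators: 1 = the study used the approach, 0 = it did not
mean_comparison = [1, 1, 0, 1, 0, 1, 1, 0]
frequency_data  = [0, 0, 1, 0, 1, 0, 1, 1]
print(round(phi(mean_comparison, frequency_data), 3))  # -0.775
```

A negative phi, as in the review's −.246, indicates that studies using one approach tended not to use the other.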

Study population characteristics

Research in the corpus focused on universities as the prime institution type (n = 191, 79%), followed by 24 (10%) non-specified institution types and colleges (n = 21, 8.2%) (see Fig. 4). Five studies (2%) included institutions classified as ‘other’, and two studies (0.8%) included both college and university students. The most frequently studied population was undergraduate students (60%, n = 146), as opposed to 33 studies (14%) focused on postgraduate students (see Fig. 6). A combination of undergraduate and postgraduate students was the subject of interest in 23 studies (9%), with 41 studies (17%) not specifying the level of study of research participants.

figure 4

Relative frequencies of study fields by country (countries with ≥3 articles). Note. Country abbreviations are as per Figure 4. A&H = Arts & Humanities; BA&L = Business, Administration and Law; EDU = Education; EM&C = Engineering, Manufacturing & Construction; H&W = Health & Welfare; ICT = Information & Communication Technologies; ID = interdisciplinary; NS,M&S = Natural Science, Mathematics & Statistics; NS = Not specified; SoS = Social Sciences, Journalism & Information

Based on the UNESCO (2015) ISCED classification, eight broad study fields are covered in the sample, with Arts & Humanities (42 studies), Education (42 studies), and Natural Sciences, Mathematics & Statistics (37 studies) being the top three study fields, followed by Health & Welfare (30 studies), Social Sciences, Journalism & Information (22 studies), Business, Administration & Law (19 studies), Information & Communication Technologies (13 studies), Engineering, Manufacturing & Construction (11 studies), and another 26 studies of interdisciplinary character. One study did not specify a field of study.

An expectancy value was calculated for each country, indicating how its studies should be distributed across disciplines if the distribution were proportional to the country's overall output. The actual deviation from this value showed that several Asian countries are home to more articles in the field of Arts & Humanities than expected: Japan with 3.3 more articles, China with 5.4 and Taiwan with 5.9. Furthermore, internationally located research shows 2.3 more interdisciplinary studies than expected, whereas studies in the Social Sciences occur more often than expected in the UK (5.7 more articles) and Australia (3.3 more articles) but less often than expected across all other countries. Interestingly, the USA has 9.9 fewer studies in Arts & Humanities than expected but 5.6 more articles than expected in Natural Sciences.
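The expectancy value described above corresponds to the expected cell count of a country-by-discipline contingency table, as used in a chi-square test: row total × column total / grand total, with the reported deviation being observed minus expected. A minimal sketch in Python illustrates the computation; the counts below are invented for illustration and are not the review's data:

```python
# Expected counts for a country-by-discipline contingency table:
# expected[i][j] = row_total[i] * col_total[j] / grand_total.
# The deviation (observed - expected) is what the review reports.
# All counts below are hypothetical.

observed = {
    ("USA", "Arts & Humanities"): 10,
    ("USA", "Natural Sciences"): 20,
    ("UK", "Arts & Humanities"): 8,
    ("UK", "Natural Sciences"): 4,
}

countries = sorted({c for c, _ in observed})
fields = sorted({f for _, f in observed})

row_totals = {c: sum(observed[(c, f)] for f in fields) for c in countries}
col_totals = {f: sum(observed[(c, f)] for c in countries) for f in fields}
grand_total = sum(observed.values())

deviations = {
    (c, f): observed[(c, f)] - row_totals[c] * col_totals[f] / grand_total
    for c in countries
    for f in fields
}

# A positive deviation means a country hosts more studies in a field
# than its overall output would lead one to expect.
print(round(deviations[("USA", "Arts & Humanities")], 1))  # prints -2.9
```

Because expected counts preserve the marginal totals, the deviations over a complete table always sum to zero, which is why a surplus in one field (e.g., Natural Sciences for the USA) is mirrored by a deficit elsewhere.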

Question One: How do the studies in the sample ground student engagement and align with theory?

Defining student engagement.

It is striking that almost all of the studies ( n  = 225, 93%) in this corpus lack a definition of student engagement, with only 18 (7%) articles attempting to define the concept. However, this is not too surprising, as the search strategy was set up with the assumption that researchers investigating student engagement (dimensions and indicators) would not necessarily label it as student engagement. When developing their definitions, authors in these 18 studies referenced 22 different sources, with the work of Kuh and colleagues e.g., (Hu & Kuh, 2002 ; Kuh, 2001 ; Kuh et al., 2006 ), as well as Astin ( 1984 ), the only authors referred to more than once. The most popular definition of student engagement within these studies was that of active participation and involvement in learning and university life e.g., (Bolden & Nahachewsky, 2015 ; Fukuzawa & Boyd, 2016 ), which was also found by Joksimović et al. ( 2018 ) in their review of MOOC research. Interaction, especially between peers and with faculty, was the next most prevalent definition e.g., (Andrew, Ewens, & Maslin-Prothero, 2015 ; Bigatel & Williams, 2015 ). Time and effort was given as a definition in four studies (Gleason, 2012 ; Hatzipanagos & Code, 2016 ; Price, Richardson, & Jelfs, 2007 ; Sun & Rueda, 2012 ), with expending physical and psychological energy (Ivala & Gachago, 2012 ) another. This variance in definitions and sources reflects the ongoing complexity of the construct (Zepke, 2018 ), and serves to reinforce the need for a clearer understanding across the field (Schindler et al., 2017 ).

Theoretical underpinnings

Reflecting findings from other systematic and literature reviews on the topic (Abdool, Nirula, Bonato, Rajji, & Silver, 2017 ; Hunsu et al., 2016 ; Kaliisa & Picard, 2017 ; Lundin et al., 2018 ), 59% ( n  = 143) of studies did not employ a theoretical model in their research. Of the 41% ( n  = 100) that did, 18 studies drew on social constructivism, followed by the Community of Inquiry model ( n  = 8), Sociocultural Learning Theory ( n  = 5), and Community of Practice models ( n  = 4). These findings also reflect the state of the field in general (Al-Sakkaf et al., 2019 ; Bond, 2019b ; Hennessy, Girvan, Mavrikis, Price, & Winters, 2018 ).

Another interesting finding of this research is that whilst 144 studies (59%) provided research questions, 99 studies (41%) did not. Although it is recognised that not all studies have research questions (Bryman, 2007 ), or only develop them throughout the research process, such as with grounded theory (Glaser & Strauss, 1967 ), a surprising number of quantitative studies (36%, n  = 37) did not have research questions. This reflects the lack of theoretical guidance, as 30 of these 37 studies also did not draw on a theoretical or conceptual framework.

Question 2: which indicators of cognitive, behavioural and affective engagement were identified in studies where educational technology was used? Which indicators of student disengagement?

Student engagement indicators.

Within the corpus, the behavioural engagement dimension was documented in some form in 209 studies (86%), whereas the dimension of affective engagement was reported in 163 studies (67%) and the cognitive dimension in only 136 studies (56%). However, the ten most often identified student engagement indicators across the studies overall (see Table  2 ) were evenly distributed over all three dimensions (see Table  3 ). The indicators participation/interaction/involvement , achievement and positive interactions with peers and teachers each appear in at least 100 studies, almost double the number for the next most frequent student engagement indicator.

Across the 243 studies in the corpus, 117 (48%) showed all three dimensions of affective, cognitive and behavioural student engagement e.g., (Szabo & Schwartz, 2011 ), including six studies that used established student engagement questionnaires, such as the NSSE (e.g., Delialioglu, 2012 ), or self-developed questionnaires addressing these three dimensions. Another 54 studies (22%) displayed at least two student engagement dimensions e.g., (Hatzipanagos & Code, 2016 ), including six questionnaire studies. Only one student engagement dimension was exhibited in 71 studies (29%) e.g., (Vural, 2013 ).

Student disengagement indicators

Indicators of student disengagement (see Table  4 ) were identified considerably less often across the corpus. This could be explained by the purpose of the studies being primarily to address or measure positive engagement, but it could also be due to a form of self-selection or publication bias, with studies yielding negative results being reported and/or published less frequently. The three disengagement indicators most often identified were frustration ( n  = 33, 14%) e.g., (Ikpeze, 2007 ), opposition/rejection ( n  = 20, 8%) e.g., (Smidt, Bunk, McGrory, Li, & Gatenby, 2014 ), and disappointment e.g., (Granberg, 2010 ) as well as other affective disengagement ( n  = 18, 7% each).

Technology tool typology and engagement/disengagement indicators

Across the 243 studies, a plethora of over 50 individual educational technology tools were employed. The top five most frequently researched tools were LMS ( n  = 89), discussion forums ( n  = 80), videos ( n  = 44), recorded lectures ( n  = 25), and chat ( n  = 24). Following a slightly modified version of Bower’s ( 2016 ) educational tools typology, 17 broad categories of tools were identified (see Additional file 4 for classification, and 3.2 for further information). The frequency with which tools from the respective groups were employed in studies varied considerably (see Additional file 4 ), with the top five categories being text-based tools ( n  = 138), followed by knowledge organisation & sharing tools ( n  = 104), multimodal production tools ( n  = 89), assessment tools ( n  = 65) and website creation tools ( n  = 29).

Figure  5 shows what percentage of each engagement dimension (e.g., affective engagement or cognitive disengagement) was fostered through each specific technology type. Given the results in 4.2.1 on student engagement, it was somewhat unsurprising to see text-based tools , knowledge organisation & sharing tools, and multimodal production tools having the highest proportion of affective, behavioural and cognitive engagement. For example, affective engagement was identified in 163 studies, with 63% of these studies using text-based tools (e.g., Bulu & Yildirim, 2008 ), and cognitive engagement was identified in 136 studies, with 47% of those using knowledge organisation & sharing tools e.g., (Shonfeld & Ronen, 2015 ). However, further analysis of studies employing discussion forums (a text-based tool ) revealed that, whilst the top affective and behavioural engagement indicators were found in almost two-thirds of studies (see Additional file  5 ), there was a substantial gap between these and the next most prevalent engagement indicator, with the exact same pattern (and indicators) emerging for wikis. This represents an area for future research.

figure 5

Engagement and disengagement by tool typology. Note. TBT = text-based tools; MPT = multimodal production tools; WCT = website creation tools; KO&S = knowledge organisation and sharing tools; DAT = data analysis tools; DST = digital storytelling tools; AT = assessment tools; SNT = social networking tools; SCT = synchronous collaboration tools; ML = mobile learning; VW = virtual worlds; LS = learning software; OL = online learning; A&H = Arts & Humanities; BA&L = Business, Administration and Law; EDU = Education; EM&C = Engineering, Manufacturing & Construction; H&W = Health & Welfare; ICT = Information & Communication Technologies; ID = interdisciplinary; NS,M&S = Natural Science, Mathematics & Statistics; NS = Not specified; SoS = Social Sciences, Journalism & Information

Interestingly, studies using website creation tools reported more disengagement than engagement indicators across all three domains (see Fig.  5 ), with studies using assessment tools and social networking tools also reporting increased instances of disengagement across two domains (affective and cognitive, and behavioural and cognitive respectively). 23 (79%) of the studies using website creation tools used blogs, with students showing, for example, disinterest in the topics chosen e.g., (Sullivan & Longnecker, 2014 ), anxiety over their lack of blogging knowledge and skills e.g., (Mansouri & Piki, 2016 ), and, in some cases, continued avoidance of using blogs despite introductory training e.g., (Keiller & Inglis-Jassiem, 2015 ). In studies where assessment tools were used, students found timed assessment stressful, particularly when trying to complete complex mathematical solutions e.g., (Gupta, 2009 ), as well as quizzes given at the end of lectures, with some students preferring time to take up the content first e.g., (DePaolo & Wilkinson, 2014 ). Disengagement in studies where social networking tools were used indicated that some students found it difficult to express themselves in short posts e.g., (Cook & Bissonnette, 2016 ), that conversations lacked authenticity e.g., (Arnold & Paulus, 2010 ), and that some did not want to mix personal and academic spaces e.g., (Ivala & Gachago, 2012 ).

Question 3: What are the learning scenarios, modes of delivery and educational technology tools employed in the studies?

Learning scenarios.

With 58.4% across the sample, social-collaborative learning (SCL) was the scenario most often employed ( n  = 142), followed by self-directed learning (SDL) in 43.2% of studies ( n  = 105) and game-based learning (GBL) in 5.8% of studies ( n  = 14) (see Fig. 6 ). Studies coded as SCL included those exploring social learning (Bandura, 1971 ) and social constructivist approaches (Vygotsky, 1978 ). Personal learning environments (PLE) were found in 2.9% of studies, 1.3% of studies used other scenarios ( n  = 3), and another 13.2% did not specify their learning scenario ( n  = 32). It is noteworthy that in 45% of possible cases employing SDL scenarios, SCL was also used. Other learning scenarios were likewise mostly used in combination with SCL and SDL. Given the rising number of higher education studies exploring flipped learning (Lundin et al., 2018 ), studies exploring this approach were also specifically coded (3%, n  = 7).

figure 6

Co-occurrence of learning scenarios across the sample ( n  = 243). Note. SDL = self-directed learning; SCL = social collaborative learning; GBL = game-based learning; PLE = personal learning environments; other = other learning scenario

Modes of delivery

In 84% of studies ( n  = 204), a single mode of delivery was used, with blended learning the most researched (109 studies), followed by distance education (72 studies) and face-to-face instruction (55 studies). Of the remaining 39 studies, 12 did not indicate their mode of delivery, whilst the other 27 studies combined or compared modes of delivery, e.g. comparing face-to-face courses to blended learning, such as the study on using iPads in undergraduate nursing education by Davies ( 2014 ).

Educational technology tools investigated

Most studies in this corpus (55%) used technology asynchronously, with 12% of studies researching synchronous tools and 18% of studies using both asynchronous and synchronous tools. Given this heavy reliance on asynchronous technology, the overall results are not surprising. However, within face-to-face contexts, the proportion of studies using synchronous tools (31%) is almost as high as that of asynchronous tools (41%), whereas synchronous tool use is surprisingly low within distance education studies (7%).

Tool categories were used in combination, with text-based tools most often combined with other technology types (see Fig.  7 ): for example, in 60% of all possible cases using multimodal production tools, 69% of all possible synchronous collaboration tool cases, 72% of all possible knowledge organisation & sharing tool cases, and a striking 89% of all possible learning software cases and 100% of all possible MOOC cases. On the contrary, text-based tools were never used in combination with games or data analysis tools . Studies using gaming tools, however, also employed assessment tools in 67% of possible cases. Assessment tools themselves constitute somewhat of a special case where studies using website creation tools are concerned, with only 7% of possible cases having employed assessment tools .

figure 7

Co-occurrence of tools across the sample ( n  = 243). Note. TBT = text-based tools; MPT = multimodal production tools; WCT = website creation tools; KO&S = knowledge organisation and sharing tools; DAT = data analysis tools; DST = digital storytelling tools; AT = assessment tools; SNT = social networking tools; SCT = synchronous collaboration tools; ML = mobile learning; VW = virtual worlds; LS = learning software; OL = online learning

In order to gain further understanding of how educational technology was used, we examined how often a combination of two variables would be expected to occur in the sample and how often it actually occurs, with deviations described as either ‘more than’ or ‘less than’ the expected value. This provides further insight into potential gaps in the literature, which can inform future research. For example, an analysis of educational technology tool usage amongst study populations (see Fig.  8 ) reveals that 5.0 more studies than expected looked at knowledge organisation & sharing tools for graduate students, but 5.0 fewer studies than expected investigated assessment tools for this group. By contrast, 5 more studies than expected researched assessment tools for unspecified study levels, and 4.3 fewer studies than expected employed knowledge organisation & sharing tools for undergraduate students.

figure 8

Relative frequency of educational technology tools used according to study level. Note. Abbreviations are explained in Fig. 7

Educational technology tools were also used differently from the expected pattern within various fields of study (see Fig.  9 ), most obviously in the case of the top five tools, but also for virtual worlds, found in 5.8 more studies in Health & Welfare than expected, and learning software, used in 6.4 more studies in Arts & Humanities than expected. In all other disciplines, learning software was used less often than assumed. Text-based tools were used more often than expected in fields of study that are already text-intensive, including Arts & Humanities, Education, Business, Administration & Law as well as Social Sciences - but less often than expected in fields such as Engineering, Health & Welfare, and Natural Sciences, Mathematics & Statistics. Multimodal production tools were used more often than expected only in Health & Welfare, ICT and Natural Sciences, and less often than assumed across all other disciplines. Assessment tools deviated most clearly, with 11.9 more studies in Natural Sciences, Mathematics & Statistics than assumed, but 5.2 fewer studies in both Education and Arts & Humanities.

figure 9

Relative frequency of educational technology tools used according to field of study. Note. TBT = text-based tools; MPT = multimodal production tools; WCT = website creation tools; KO&S = knowledge organisation and sharing tools; DAT = data analysis tools; DST = digital storytelling tools; AT = assessment tools; SNT = social networking tools; SCT = synchronous collaboration tools; ML = mobile learning; VW = virtual worlds; LS = learning software; OL = online learning

In regards to mode of delivery and educational technology tools used, it is interesting to see that, of the five top tools, all except assessment tools were used in face-to-face instruction less often than expected (see Fig.  10 ), from 1.6 fewer studies for website creation tools to 14.5 fewer studies for knowledge organisation & sharing tools . Assessment tools , however, were used in 3.3 more studies than expected in face-to-face instruction - but less often than assumed (although moderately) in blended learning and distance education formats. Text-based tools, multimodal production tools and knowledge organisation & sharing tools were employed more often than expected in blended and distance learning, most obviously with 13.1 more studies on text-based tools and 8.2 more studies on knowledge organisation & sharing tools in distance education. Contrary to what one would perhaps expect, social networking tools were used in 4.2 fewer studies than expected for this mode of delivery.

figure 10

Relative frequency of educational technology tools used according to mode of delivery. Note. Tool abbreviations as per Fig. 7. BL = Blended learning; DE = Distance education; F2F = Face-to-face; NS = Not stated

Discussion

The findings of this study confirm those of previous research, with the most prolific countries being the US, UK, Australia, Taiwan and China. This is rather representative of the field, with an analysis of instructional design and technology research from 2007 to 2017 listing the most productive countries as the US, Taiwan, UK, Australia and Turkey (Bodily, Leary, & West, 2019 ). Likewise, an analysis of 40 years of research in Computers & Education (CAE) found that the US, UK and Taiwan accounted for 49.9% of all publications (Bond, 2018 ). By contrast, a lack of African research was apparent in this review, which is also evident in educational technology research in top tier peer-reviewed journals, with only 4% of articles published in the British Journal of Educational Technology ( BJET ) in the past decade (Bond, 2019b ) and 2% of articles in the Australasian Journal of Educational Technology (AJET) (Bond, 2018 ) hailing from Africa. Similar results were also found in previous literature and systematic reviews (see Table 1 ), which again raises questions of literature search and inclusion strategies, which will be further discussed in the limitations section.

Whilst other reviews of educational technology and student engagement have found studies to be largely STEM focused (Boyle et al., 2016 ; Li et al., 2017 ; Lundin et al., 2018 ; Nikou & Economides, 2018 ), this corpus features a more balanced scope of research, with the fields of Arts & Humanities (42 studies, 17.3%) and Education (42 studies, 17.3%) constituting roughly one third of all studies in the corpus, and Natural Sciences, Mathematics & Statistics nevertheless ranking third with 38 studies (15.6%). Beyond these three fields, further research is needed within underrepresented fields of study, in order to gain more comprehensive insights into the usage of educational technology tools (Kay & LeSage, 2009 ; Nikou & Economides, 2018 ).

Results of the systematic map further confirm the focus that prior educational technology research has placed on undergraduate students as the target group and participants in technology-enhanced learning settings e.g. (Cheston et al., 2013 ; Henrie et al., 2015 ). With the overwhelming number of 146 studies researching undergraduate students—compared to 33 studies on graduate students and 23 studies investigating both study levels—this also indicates that further investigation into the graduate student experience is needed. Furthermore, the fact that 41 studies do not report the study level of their participants is interesting albeit problematic, as implications cannot easily be drawn for application to one’s own specific teaching context if the target group under investigation is not clearly specified. More precise reporting of participants’ details, as well as specification of the study context (country, institution, and study level, to name a few), is needed to transfer and apply study results to practice—making it possible to take into account why some interventions succeed and others do not.

In line with other studies e.g. (Henrie et al., 2015 ), this review has also demonstrated that student engagement remains an under-theorised concept that is often considered only fragmentarily in research. Whilst studies in this review have often focused on isolated aspects of student engagement, their results are nevertheless interesting and valuable. However, it is important to relate these individual facets to the larger framework of student engagement, by considering how these aspects are connected and linked to each other. This is especially helpful for integrating research findings into practice, given that student engagement and disengagement are rarely one-dimensional; it is not enough to focus on only one aspect of engagement without also looking at the aspects adjacent to it (Pekrun & Linnenbrink-Garcia, 2012 ). It is also vital, therefore, that researchers develop and refine an understanding of student engagement, and make this explicit in their research (Appleton et al., 2008 ; Christenson et al., 2012 ).

Reflective of current conversations in the field of educational technology (Bond, 2019b ; Castañeda & Selwyn, 2018 ; Hew et al., 2019 ), as well as other reviews (Abdool et al., 2017 ; Hunsu et al., 2016 ; Kaliisa & Picard, 2017 ; Lundin et al., 2018 ), a substantial number of studies in this corpus did not have any theoretical underpinnings. Kaliisa and Picard ( 2017 ) argue that, without theory, research can result in disorganised accounts and issues with interpreting data, with research effectively “sit[ting] in a void if it’s not theoretically connected” (Kara, 2017 , p. 56). Therefore, framing research in educational technology with a stronger theoretical basis can assist with locating the “field’s disciplinary alignment” (Crook, 2019 , p. 486) and further drive conversations forward.

The application of methods in this corpus was interesting in two ways. First, it is noticeable that quantitative studies are prevalent across the 243 articles in the sample, with comparatively few studies employing qualitative research methods (56 studies, as opposed to 84 mixed methods studies and 103 quantitative studies). This is also reflected in the educational technology field at large: a review of articles published in BJET and Educational Technology Research & Development (ETR&D) from 2002 to 2014 revealed that 40% of articles used quantitative methods, 26% qualitative and 13% mixed (Baydas, Kucuk, Yilmaz, Aydemir, & Goktas, 2015 ), and likewise a review of educational technology research from Turkey 1990–2011 revealed that 53% of articles used quantitative methods, 22% qualitative and 10% mixed methods (Kucuk, Aydemir, Yildirim, Arpacik, & Goktas, 2013 ). Quantitative studies primarily show whether or not an intervention has worked when applied to, for example, a group of students in a certain setting, as in the study of the effect of mobile apps on student performance in engineering education by Jou, Lin, and Tsai ( 2016 ); however, not all student engagement indicators can actually be measured in this way. The lower numbers of affective and cognitive engagement found in the studies in the corpus reflect a wider call to the field to increase research on these two dimensions (Henrie et al., 2015 ; Joksimović et al., 2018 ; O’Flaherty & Phillips, 2015 ; Schindler et al., 2017 ). Whilst these are arguably more difficult to measure than behavioural engagement, the use of more rigorous and accurate surveys could be one possibility, as surveys can “capture unobservable aspects” (Henrie et al., 2015 , p. 45) such as student feelings and information about the cognitive strategies they employ (Finn & Zimmer, 2012 ). However, they are often lengthy and onerous, or subject to the limitations of self-selection.

Whereas low numbers of qualitative studies researching student engagement and educational technology were previously identified in other student engagement and technology reviews (Connolly et al., 2012 ; Kay & LeSage, 2009 ; Lundin et al., 2018 ), it is studies like that by Lopera Medina ( 2014 ) in this sample that reveal how learners perceive the educational experience and how the learning process actually unfolds. Therefore, more qualitative and ethnographic measures should also be employed, such as student observations with thick descriptions, which can help shed light on the complexity of teaching and learning environments (Fredricks et al., 2004 ; Heflin, Shewmaker, & Nguyen, 2017 ). Conducting observations can be costly, however, both in time and money, so this is suggested in combination with computerised learning analytics data, which can provide measurable, objective and timely insight into how certain manifestations of engagement change over time (Henrie et al., 2015 ; Ma et al., 2015 ).

Whereas other results of this review have confirmed previous findings in the field, the technology tools used in the studies, considered in relation to student engagement, deviate from prior research. Whilst Henrie et al. ( 2015 ) found that the most frequently researched tools were discussion forums, general websites, LMS, general campus software and videos, the studies here focused predominantly on LMS, discussion forums, videos, recorded lectures and chat. Furthermore, whilst Schindler et al. ( 2017 ) found that digital games, web-conferencing software and Facebook were the most effective tools at enhancing student engagement, this review found that it was rather text-based tools , knowledge organisation & sharing tools , and multimodal production tools .

Limitations

During the execution of this systematic review, we tried to adhere to the method as rigorously as possible. However, several challenges were encountered - some of which are addressed and discussed in another publication (Bedenlier, 2020b ) - resulting in limitations to this study. Four large, general educational research databases were searched, which are international in scope. However, by applying the criterion of articles published in English, research published on this topic in languages other than English was not included in this review. The same applies to research documented in, for example, grey literature, book chapters or monographs, or articles from journals that are not indexed in the four databases searched. Another limitation is that only research published within the period 2007–2016 was investigated. Whilst we are cognisant of this restriction, we also think that the technological advances and the implications to be drawn from this time-frame relate more meaningfully to the current situation than would have been the case for technologies used in the 1990s (see Bond, 2019b ). The sampling strategy also most likely accounts for the low number of studies from certain countries, e.g. in South America and Africa.

Studies included in this review represent various academic fields, and they also vary in the rigour with which they were conducted. Harden and Gough ( 2012 ) stress that the appraisal of quality and relevance of studies “ensure[s] that only the most appropriate, trustworthy and relevant studies are used to develop the conclusions of the review” (p. 154). We therefore included being a peer-reviewed contribution as a formal inclusion criterion from the beginning. In doing so, we reason that studies met a baseline of quality applicable to published research in a specific field - otherwise they would not have been accepted for publication by the respective community. Finally, whilst the studies were diligently read and coded, and disagreements discussed and reconciled, the human flaw of having overlooked or misinterpreted information provided in the individual articles cannot fully be excluded.

Finally, the results presented here provide an initial window into the overall body of research identified during the search, and further research is being undertaken to provide deeper insight into discipline-specific use of technology and resulting student engagement, using subsets of this sample (Bedenlier, 2020a ; Bond, M., Bedenlier, S., Buntins, K., Kerres, M., & Zawacki-Richter, O.: Facilitating student engagement through educational technology: A systematic review in the field of education, forthcoming).

Recommendations for future work and implications for practice

Whilst the evidence map presented in this article has confirmed previous research on the nexus of educational technology and student engagement, it has also elucidated a number of areas that further research is invited to address. Although these findings are similar to those of previous reviews, in order to understand student engagement more fully and comprehensively as a multi-faceted construct, it is not enough to focus only on indicators of engagement that can easily be measured; the more complex endeavour of uncovering and investigating those indicators that reside below the surface is also required. This includes the careful alignment of theory and methodological design, in order both to adequately analyse the phenomenon under investigation and to contribute to a soundly executed body of research within the field of educational technology. Further research is invited in particular into how educational technology affects cognitive and affective engagement, whilst considering how this fits within the broader sociocultural framework of engagement (Bond, 2019a ). Further research is also invited into how educational technology affects student engagement within fields of study beyond Arts & Humanities, Education and Natural Sciences, Mathematics & Statistics, as well as within graduate level courses. The use of more qualitative research methods is particularly encouraged.

The findings of this review suggest that research gaps exist with particular combinations of tools, study levels and modes of delivery. With respect to study level, the use of assessment tools with graduate students, as well as knowledge organisation & sharing tools with undergraduate students, are topics researched far less than expected. The use of text-based tools in Engineering, Health & Welfare and Natural Sciences, Mathematics & Statistics, as well as the use of multimodal production tools outside of these disciplines, are also areas for future research, as is the use of assessment tools in the fields of Education and Arts & Humanities in particular.

With 109 studies in this systematic review using a blended learning design, the findings confirm the argument that online distance education and traditional face-to-face education are becoming increasingly integrated with one another. Whilst this indicates that many educators have made the move from face-to-face teaching to technology-enhanced learning, it also makes a case for further professional development, so that educators can apply these tools effectively within their own teaching contexts; this review indicates that further research is needed in particular into the use of social networking tools in online/distance education. It is also necessary to ask not only why the number of published studies is low within certain countries and regions, but also what lies behind this. This entails questioning the conditions under which research is being conducted, potentially criticising the publication policies of major, Western-based journals, but also ultimately reflecting on one's own search strategy and research assumptions as a Western educator-researcher.

Based on the findings of this review, educators within higher education institutions are encouraged to use text-based tools, knowledge organisation & sharing tools, and multimodal production tools in particular and, whilst any technology can lead to disengagement if not employed effectively, to be mindful that website creation tools (blogs and ePortfolios), social networking tools and assessment tools were found to be more disengaging than engaging in this review. Educators are therefore encouraged to ensure that students receive sufficient and ongoing training for any new technology used, including those that might appear straightforward (e.g. blogs), and to recognise that students may require extra writing support. Discussion and blog topics should be interesting, allow student agency, and be authentic to students, including through the use of social media. Social networking tools that augment students' professional learning networks are particularly useful. Educators should also be aware, however, that some students do not want to mix their academic and personal lives, and so the decision to use certain social platforms could be made together with students.

Availability of data and materials

All data will be made publicly available, as part of the funding requirements, via https://www.researchgate.net/project/Facilitating-student-engagement-with-digital-media-in-higher-education-ActiveLeaRn .

The detailed search strategy, including the modified search strings according to the individual databases, can be retrieved from https://www.researchgate.net/project/Facilitating-student-engagement-with-digital-media-in-higher-education-ActiveLeaRn .

The full code set can be retrieved from the review protocol at https://www.researchgate.net/project/Facilitating-student-engagement-with-digital-media-in-higher-education-ActiveLeaRn .

Abdool, P. S., Nirula, L., Bonato, S., Rajji, T. K., & Silver, I. L. (2017). Simulation in undergraduate psychiatry: Exploring the depth of learner engagement. Academic Psychiatry : the Journal of the American Association of Directors of Psychiatric Residency Training and the Association for Academic Psychiatry , 41 (2), 251–261. https://doi.org/10.1007/s40596-016-0633-9 .

Alioon, Y., & Delialioğlu, Ö. (2017). The effect of authentic m-learning activities on student engagement and motivation. British Journal of Educational Technology , 32 , 121. https://doi.org/10.1111/bjet.12559 .

Alrasheedi, M., Capretz, L. F., & Raza, A. (2015). A systematic review of the critical factors for success of mobile learning in higher education (university students’ perspective). Journal of Educational Computing Research , 52 (2), 257–276. https://doi.org/10.1177/0735633115571928 .

Al-Sakkaf, A., Omar, M., & Ahmad, M. (2019). A systematic literature review of student engagement in software visualization: A theoretical perspective. Computer Science Education , 29 (2–3), 283–309. https://doi.org/10.1080/08993408.2018.1564611 .

Andrew, L., Ewens, B., & Maslin-Prothero, S. (2015). Enhancing the online learning experience using virtual interactive classrooms. Australian Journal of Advanced Nursing , 32 (4), 22–31.

Antonenko, P. D. (2015). The instrumental value of conceptual frameworks in educational technology research. Educational Technology Research and Development , 63 (1), 53–71. https://doi.org/10.1007/s11423-014-9363-4 .

Appleton, J. J., Christenson, S. L., & Furlong, M. J. (2008). Student engagement with school: Critical conceptual and methodological issues of the construct. Psychology in the Schools , 45 (5), 369–386. https://doi.org/10.1002/pits.20303 .

Arnold, N., & Paulus, T. (2010). Using a social networking site for experiential learning: Appropriating, lurking, modeling and community building. Internet and Higher Education , 13 (4), 188–196. https://doi.org/10.1016/j.iheduc.2010.04.002 .

Astin, A. W. (1984). Student involvement: A developmental theory for higher education. Journal of College Student Development , 25 (4), 297–308.

Astin, A. W. (1999). Student involvement: A developmental theory for higher education. Journal of College Student Development , 40 (5), 518–529. https://www.researchgate.net/publication/220017441 (Original work published July 1984).

Atmacasoy, A., & Aksu, M. (2018). Blended learning at pre-service teacher education in Turkey: A systematic review. Education and Information Technologies , 23 (6), 2399–2422. https://doi.org/10.1007/s10639-018-9723-5 .

Azevedo, R. (2015). Defining and measuring engagement and learning in science: Conceptual, theoretical, methodological, and analytical issues. Educational Psychologist , 50 (1), 84–94. https://doi.org/10.1080/00461520.2015.1004069 .

Bandura, A. (1971). Social learning theory . New York: General Learning Press.

Barak, M. (2018). Are digital natives open to change? Examining flexible thinking and resistance to change. Computers & Education , 121 , 115–123. https://doi.org/10.1016/j.compedu.2018.01.016 .

Barak, M., & Levenberg, A. (2016). Flexible thinking in learning: An individual differences measure for learning in technology-enhanced environments. Computers & Education , 99 , 39–52. https://doi.org/10.1016/j.compedu.2016.04.003 .

Baron, P., & Corbin, L. (2012). Student engagement: Rhetoric and reality. Higher Education Research and Development , 31 (6), 759–772. https://doi.org/10.1080/07294360.2012.655711 .

Baydas, O., Kucuk, S., Yilmaz, R. M., Aydemir, M., & Goktas, Y. (2015). Educational technology research trends from 2002 to 2014. Scientometrics , 105 (1), 709–725. https://doi.org/10.1007/s11192-015-1693-4 .

Bedenlier, S., Bond, M., Buntins, K., Zawacki-Richter, O., & Kerres, M. (2020a). Facilitating student engagement through educational technology in higher education: A systematic review in the field of arts & humanities. Australasian Journal of Educational Technology , 36 (4), 27–47. https://doi.org/10.14742/ajet.5477 .

Bedenlier, S., Bond, M., Buntins, K., Zawacki-Richter, O., & Kerres, M. (2020b). Learning by Doing? Reflections on Conducting a Systematic Review in the Field of Educational Technology. In O. Zawacki-Richter, M. Kerres, S. Bedenlier, M. Bond, & K. Buntins (Eds.), Systematic Reviews in Educational Research (Vol. 45 , pp. 111–127). Wiesbaden: Springer Fachmedien Wiesbaden. https://doi.org/10.1007/978-3-658-27602-7_7 .

Ben-Eliyahu, A., Moore, D., Dorph, R., & Schunn, C. D. (2018). Investigating the multidimensionality of engagement: Affective, behavioral, and cognitive engagement across science activities and contexts. Contemporary Educational Psychology , 53 , 87–105. https://doi.org/10.1016/j.cedpsych.2018.01.002 .

Betihavas, V., Bridgman, H., Kornhaber, R., & Cross, M. (2016). The evidence for ‘flipping out’: A systematic review of the flipped classroom in nursing education. Nurse Education Today , 38 , 15–21. https://doi.org/10.1016/j.nedt.2015.12.010 .

Bigatel, P., & Williams, V. (2015). Measuring student engagement in an online program. Online Journal of Distance Learning Administration , 18 (2), 9.

Bodily, R., Leary, H., & West, R. E. (2019). Research trends in instructional design and technology journals. British Journal of Educational Technology , 50 (1), 64–79. https://doi.org/10.1111/bjet.12712 .

Boekaerts, M. (2016). Engagement as an inherent aspect of the learning process. Learning and Instruction , 43 , 76–83. https://doi.org/10.1016/j.learninstruc.2016.02.001 .

Bolden, B., & Nahachewsky, J. (2015). Podcast creation as transformative music engagement. Music Education Research , 17 (1), 17–33. https://doi.org/10.1080/14613808.2014.969219 .

Bond, M. (2018). Helping doctoral students crack the publication code: An evaluation and content analysis of the Australasian Journal of Educational Technology. Australasian Journal of Educational Technology , 34 (5), 168–183. https://doi.org/10.14742/ajet.4363 .

Bond, M., & Bedenlier, S. (2019a). Facilitating Student Engagement Through Educational Technology: Towards a Conceptual Framework. Journal of Interactive Media in Education , 2019 (1), 1-14. https://doi.org/10.5334/jime.528 .

Bond, M., Zawacki-Richter, O., & Nichols, M. (2019b). Revisiting five decades of educational technology research: A content and authorship analysis of the British Journal of Educational Technology. British Journal of Educational Technology , 50 (1), 12–63. https://doi.org/10.1111/bjet.12730 .

Bouta, H., Retalis, S., & Paraskeva, F. (2012). Utilising a collaborative macro-script to enhance student engagement: A mixed method study in a 3D virtual environment. Computers & Education , 58 (1), 501–517. https://doi.org/10.1016/j.compedu.2011.08.031 .

Bower, M. (2015). A typology of web 2.0 learning technologies . EDUCAUSE Digital Library Retrieved 20 June 2019, from http://www.educause.edu/library/resources/typology-web-20-learning-technologies .

Bower, M. (2016). Deriving a typology of web 2.0 learning technologies. British Journal of Educational Technology , 47 (4), 763–777. https://doi.org/10.1111/bjet.12344 .

Boyle, E. A., Connolly, T. M., Hainey, T., & Boyle, J. M. (2012). Engagement in digital entertainment games: A systematic review. Computers in Human Behavior , 28 (3), 771–780. https://doi.org/10.1016/j.chb.2011.11.020 .

Boyle, E. A., Hainey, T., Connolly, T. M., Gray, G., Earp, J., Ott, M., … Pereira, J. (2016). An update to the systematic literature review of empirical evidence of the impacts and outcomes of computer games and serious games. Computers & Education , 94 , 178–192. https://doi.org/10.1016/j.compedu.2015.11.003 .

Broadbent, J., & Poon, W. L. (2015). Self-regulated learning strategies & academic achievement in online higher education learning environments: A systematic review. The Internet and Higher Education , 27 , 1–13. https://doi.org/10.1016/j.iheduc.2015.04.007 .

Brunton, G., Stansfield, C., & Thomas, J. (2012). Finding relevant studies. In D. Gough, S. Oliver, & J. Thomas (Eds.), An introduction to systematic reviews , (pp. 107–134). Los Angeles: Sage.

Bryman, A. (2007). The research question in social research: What is its role? International Journal of Social Research Methodology , 10 (1), 5–20. https://doi.org/10.1080/13645570600655282 .

Bulu, S. T., & Yildirim, Z. (2008). Communication behaviors and trust in collaborative online teams. Educational Technology & Society , 11 (1), 132–147.

Bundick, M., Quaglia, R., Corso, M., & Haywood, D. (2014). Promoting student engagement in the classroom. Teachers College Record , 116 (4) Retrieved from http://www.tcrecord.org/content.asp?contentid=17402 .

Castañeda, L., & Selwyn, N. (2018). More than tools? Making sense of the ongoing digitizations of higher education. International Journal of Educational Technology in Higher Education , 15 (1), 211. https://doi.org/10.1186/s41239-018-0109-y .

Chen, P.-S. D., Lambert, A. D., & Guidry, K. R. (2010). Engaging online learners: The impact of web-based learning technology on college student engagement. Computers & Education , 54 (4), 1222–1232. https://doi.org/10.1016/j.compedu.2009.11.008 .

Cheston, C. C., Flickinger, T. E., & Chisolm, M. S. (2013). Social media use in medical education: A systematic review. Academic Medicine : Journal of the Association of American Medical Colleges , 88 (6), 893–901. https://doi.org/10.1097/ACM.0b013e31828ffc23 .

Choi, M., Glassman, M., & Cristol, D. (2017). What it means to be a citizen in the internet age: Development of a reliable and valid digital citizenship scale. Computers & Education , 107 , 100–112. https://doi.org/10.1016/j.compedu.2017.01.002 .

Christenson, S. L., Reschly, A. L., & Wylie, C. (Eds.) (2012). Handbook of research on student engagement . Boston: Springer US.

Coates, H. (2007). A model of online and general campus-based student engagement. Assessment & Evaluation in Higher Education , 32 (2), 121–141. https://doi.org/10.1080/02602930600801878 .

Connolly, T. M., Boyle, E. A., MacArthur, E., Hainey, T., & Boyle, J. M. (2012). A systematic literature review of empirical evidence on computer games and serious games. Computers & Education , 59 (2), 661–686. https://doi.org/10.1016/j.compedu.2012.03.004 .

Cook, M. P., & Bissonnette, J. D. (2016). Developing preservice teachers’ positionalities in 140 characters or less: Examining microblogging as dialogic space. Contemporary Issues in Technology and Teacher Education (CITE Journal) , 16 (2), 82–109.

Crompton, H., Burke, D., Gregory, K. H., & Gräbe, C. (2016). The use of mobile learning in science: A systematic review. Journal of Science Education and Technology , 25 (2), 149–160. https://doi.org/10.1007/s10956-015-9597-x .

Crook, C. (2019). The “British” voice of educational technology research: 50th birthday reflection. British Journal of Educational Technology , 50 (2), 485–489. https://doi.org/10.1111/bjet.12757 .

Davies, M. (2014). Using the apple iPad to facilitate student-led group work and seminar presentation. Nurse Education in Practice , 14 (4), 363–367. https://doi.org/10.1016/j.nepr.2014.01.006 .

Delialioglu, O. (2012). Student engagement in blended learning environments with lecture-based and problem-based instructional approaches. Educational Technology & Society , 15 (3), 310–322.

DePaolo, C. A., & Wilkinson, K. (2014). Recurrent online quizzes: Ubiquitous tools for promoting student presence, participation and performance. Interdisciplinary Journal of E-Learning and Learning Objects , 10 , 75–91 Retrieved from http://www.ijello.org/Volume10/IJELLOv10p075-091DePaolo0900.pdf .

Doherty, K., & Doherty, G. (2018). Engagement in HCI. ACM Computing Surveys , 51 (5), 1–39. https://doi.org/10.1145/3234149 .

Eccles, J. (2016). Engagement: Where to next? Learning and Instruction , 43 , 71–75. https://doi.org/10.1016/j.learninstruc.2016.02.003 .

Eccles, J., & Wang, M.-T. (2012). Part I commentary: So what is student engagement anyway? In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 133–145). Boston: Springer US Retrieved from http://link.springer.com/10.1007/978-1-4614-2018-7_6 .

Englund, C., Olofsson, A. D., & Price, L. (2017). Teaching with technology in higher education: Understanding conceptual change and development in practice. Higher Education Research and Development , 36 (1), 73–87. https://doi.org/10.1080/07294360.2016.1171300 .

Fabian, K., Topping, K. J., & Barron, I. G. (2016). Mobile technology and mathematics: Effects on students’ attitudes, engagement, and achievement. Journal of Computers in Education , 3 (1), 77–104. https://doi.org/10.1007/s40692-015-0048-8 .

Filsecker, M., & Kerres, M. (2014). Engagement as a volitional construct. Simulation & Gaming , 45 (4–5), 450–470. https://doi.org/10.1177/1046878114553569 .

Finn, J. (2006). The adult lives of at-risk students: The roles of attainment and engagement in high school (NCES 2006-328) . Washington, DC: U.S. Department of Education, National Center for Education Statistics Retrieved from website: https://nces.ed.gov/pubs2006/2006328.pdf .

Finn, J., & Zimmer, K. (2012). Student engagement: What is it? Why does it matter? In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 97–131). Boston: Springer US. https://doi.org/10.1007/978-1-4614-2018-7_5 .

Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research , 74 (1), 59–109. https://doi.org/10.3102/00346543074001059 .

Fredricks, J. A., Filsecker, M., & Lawson, M. A. (2016). Student engagement, context, and adjustment: Addressing definitional, measurement, and methodological issues. Learning and Instruction , 43 , 1–4. https://doi.org/10.1016/j.learninstruc.2016.02.002 .

Fredricks, J. A., Wang, M.-T., Schall Linn, J., Hofkens, T. L., Sung, H., Parr, A., & Allerton, J. (2016). Using qualitative methods to develop a survey measure of math and science engagement. Learning and Instruction , 43 , 5–15. https://doi.org/10.1016/j.learninstruc.2016.01.009 .

Fukuzawa, S., & Boyd, C. (2016). Student engagement in a large classroom: Using technology to generate a hybridized problem-based learning experience in a large first year undergraduate class. Canadian Journal for the Scholarship of Teaching and Learning , 7 (1). https://doi.org/10.5206/cjsotl-rcacea.2016.1.7 .

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research . Chicago: Aldine.

Gleason, J. (2012). Using technology-assisted instruction and assessment to reduce the effect of class size on student outcomes in undergraduate mathematics courses. College Teaching , 60 (3), 87–94.

Gough, D., Oliver, S., & Thomas, J. (2012). An introduction to systematic reviews . Los Angeles: Sage.

Granberg, C. (2010). Social software for reflective dialogue: Questions about reflection and dialogue in student Teachers’ blogs. Technology, Pedagogy and Education , 19 (3), 345–360. https://doi.org/10.1080/1475939X.2010.513766 .

Greenwood, L., & Kelly, C. (2019). A systematic literature review to explore how staff in schools describe how a sense of belonging is created for their pupils. Emotional and Behavioural Difficulties , 24 (1), 3–19. https://doi.org/10.1080/13632752.2018.1511113 .

Gupta, M. L. (2009). Using emerging technologies to promote student engagement and learning in agricultural mathematics. International Journal of Learning , 16 (10), 497–508. https://doi.org/10.18848/1447-9494/CGP/v16i10/46658 .

Harden, A., & Gough, D. (2012). Quality and relevance appraisal. In D. Gough, S. Oliver, & J. Thomas (Eds.), An introduction to systematic reviews , (pp. 153–178). London: Sage.

Hatzipanagos, S., & Code, J. (2016). Open badges in online learning environments: Peer feedback and formative assessment as an engagement intervention for promoting agency. Journal of Educational Multimedia and Hypermedia , 25 (2), 127–142.

Heflin, H., Shewmaker, J., & Nguyen, J. (2017). Impact of mobile technology on student attitudes, engagement, and learning. Computers & Education , 107 , 91–99. https://doi.org/10.1016/j.compedu.2017.01.006 .

Henderson, M., Selwyn, N., & Aston, R. (2017). What works and why? Student perceptions of ‘useful’ digital technology in university teaching and learning. Studies in Higher Education , 42 (8), 1567–1579. https://doi.org/10.1080/03075079.2015.1007946 .

Hennessy, S., Girvan, C., Mavrikis, M., Price, S., & Winters, N. (2018). Editorial. British Journal of Educational Technology , 49 (1), 3–5. https://doi.org/10.1111/bjet.12598 .

Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in technology-mediated learning: A review. Computers & Education , 90 , 36–53. https://doi.org/10.1016/j.compedu.2015.09.005 .

Hew, K. F., & Cheung, W. S. (2013). Use of web 2.0 technologies in K-12 and higher education: The search for evidence-based practice. Educational Research Review , 9 , 47–64. https://doi.org/10.1016/j.edurev.2012.08.001 .

Hew, K. F., Lan, M., Tang, Y., Jia, C., & Lo, C. K. (2019). Where is the “theory” within the field of educational technology research? British Journal of Educational Technology , 50 (3), 956–971. https://doi.org/10.1111/bjet.12770 .

Howard, S. K., Ma, J., & Yang, J. (2016). Student rules: Exploring patterns of students’ computer-efficacy and engagement with digital technologies in learning. Computers & Education , 101 , 29–42. https://doi.org/10.1016/j.compedu.2016.05.008 .

Hu, S., & Kuh, G. D. (2002). Being (dis)engaged in educationally purposeful activities: The influences of student and institutional characteristics. Research in Higher Education , 43 (5), 555–575. https://doi.org/10.1023/A:1020114231387 .

Hunsu, N. J., Adesope, O., & Bayly, D. J. (2016). A meta-analysis of the effects of audience response systems (clicker-based technologies) on cognition and affect. Computers & Education , 94 , 102–119. https://doi.org/10.1016/j.compedu.2015.11.013 .

Ikpeze, C. (2007). Small group collaboration in peer-led electronic discourse: An analysis of group dynamics and interactions involving Preservice and Inservice teachers. Journal of Technology and Teacher Education , 15 (3), 383–407.

Ivala, E., & Gachago, D. (2012). Social media for enhancing student engagement: The use of Facebook and blogs at a university of technology. South African Journal of Higher Education , 26 (1), 152–167.

Järvelä, S., Järvenoja, H., Malmberg, J., Isohätälä, J., & Sobocinski, M. (2016). How do types of interaction and phases of self-regulated learning set a stage for collaborative engagement? Learning and Instruction , 43 , 39–51. https://doi.org/10.1016/j.learninstruc.2016.01.005 .

Joksimović, S., Poquet, O., Kovanović, V., Dowell, N., Mills, C., Gašević, D., … Brooks, C. (2018). How do we model learning at scale? A systematic review of research on MOOCs. Review of Educational Research , 88 (1), 43–86. https://doi.org/10.3102/0034654317740335 .

Jou, M., Lin, Y.-T., & Tsai, H.-C. (2016). Mobile APP for motivation to learning: An engineering case. Interactive Learning Environments , 24 (8), 2048–2057. https://doi.org/10.1080/10494820.2015.1075136 .

Junco, R. (2012). The relationship between frequency of Facebook use, participation in Facebook activities, and student engagement. Computers & Education , 58 (1), 162–171. https://doi.org/10.1016/j.compedu.2011.08.004 .

Kahn, P. (2014). Theorising student engagement in higher education. British Educational Research Journal , 40 (6), 1005–1018. https://doi.org/10.1002/berj.3121 .

Kahu, E. R. (2013). Framing student engagement in higher education. Studies in Higher Education , 38 (5), 758–773. https://doi.org/10.1080/03075079.2011.598505 .

Kahu, E. R., & Nelson, K. (2018). Student engagement in the educational interface: Understanding the mechanisms of student success. Higher Education Research and Development , 37 (1), 58–71. https://doi.org/10.1080/07294360.2017.1344197 .

Kaliisa, R., & Picard, M. (2017). A systematic review on mobile learning in higher education: The African perspective. The Turkish Online Journal of Educational Technology , 16 (1) Retrieved from https://files.eric.ed.gov/fulltext/EJ1124918.pdf .

Kara, H. (2017). Research and evaluation for busy students and practitioners: A time-saving guide , (2nd ed., ). Bristol: Policy Press.

Karabulut-Ilgu, A., Jaramillo Cherrez, N., & Jahren, C. T. (2018). A systematic review of research on the flipped learning method in engineering education: Flipped learning in engineering education. British Journal of Educational Technology , 49 (3), 398–411. https://doi.org/10.1111/bjet.12548 .

Kay, R. H., & LeSage, A. (2009). Examining the benefits and challenges of using audience response systems: A review of the literature. Computers & Education , 53 (3), 819–827. https://doi.org/10.1016/j.compedu.2009.05.001 .

Keiller, L., & Inglis-Jassiem, G. (2015). A lesson in listening: Is the student voice heard in the rush to incorporate technology into health professions education? African Journal of Health Professions Education , 7 (1), 47–50. https://doi.org/10.7196/ajhpe.371 .

Kelley, K., Lai, K., & Lai, M. K. (2018). Package ‘MBESS’. Retrieved from https://cran.r-project.org/web/packages/MBESS/MBESS.pdf

Kerres, M. (2013). Mediendidaktik. Konzeption und Entwicklung mediengestützter Lernangebote . München: Oldenbourg.

Kirkwood, A. (2009). E-learning: You don’t always get what you hope for. Technology, Pedagogy and Education , 18 (2), 107–121. https://doi.org/10.1080/14759390902992576 .

Koehler, M., & Mishra, P. (2005). What happens when teachers design educational technology? The development of technological pedagogical content knowledge. Journal of Educational Computing Research , 32 (2), 131–152.

Krause, K.-L., & Coates, H. (2008). Students’ engagement in first-year university. Assessment & Evaluation in Higher Education , 33 (5), 493–505. https://doi.org/10.1080/02602930701698892 .

Kucuk, S., Aydemir, M., Yildirim, G., Arpacik, O., & Goktas, Y. (2013). Educational technology research trends in Turkey from 1990 to 2011. Computers & Education , 68 , 42–50. https://doi.org/10.1016/j.compedu.2013.04.016 .

Kuh, G. D. (2001). The National Survey of student engagement: Conceptual framework and overview of psychometric properties . Bloomington: Indiana University Center for Postsecondary Research Retrieved from http://nsse.indiana.edu/2004_annual_report/pdf/2004_conceptual_framework.pdf .

Kuh, G. D. (2009). What student affairs professionals need to know about student engagement. Journal of College Student Development , 50 (6), 683–706. https://doi.org/10.1353/csd.0.0099 .

Kuh, G. D., Cruce, T. M., Shoup, R., Kinzie, J., & Gonyea, R. M. (2008). Unmasking the effects of student engagement on first-year college grades and persistence. The Journal of Higher Education , 79 (5), 540–563 Retrieved from http://www.jstor.org.ezproxy.umuc.edu/stable/25144692 .

Kuh, G. D., Kinzie, J., Buckley, J. A., Bridges, B. K., & Hayek, J. C. (2006). What matters to student success: A review of the literature . Washington, DC: National Postsecondary Education Cooperative.

Kupper, L. L., & Hafner, K. B. (1989). How appropriate are popular sample size formulas? The American Statistician , 43 (2), 101–105.

Lai, J. W. M., & Bower, M. (2019). How is the use of technology in education evaluated? A systematic review. Computers & Education , 133 , 27–42. https://doi.org/10.1016/j.compedu.2019.01.010 .

Lawson, M. A., & Lawson, H. A. (2013). New conceptual frameworks for student engagement research, policy, and practice. Review of Educational Research , 83 (3), 432–479. https://doi.org/10.3102/0034654313480891 .

Leach, L., & Zepke, N. (2011). Engaging students in learning: A review of a conceptual organiser. Higher Education Research and Development , 30 (2), 193–204. https://doi.org/10.1080/07294360.2010.509761 .

Li, J., van der Spek, E. D., Feijs, L., Wang, F., & Hu, J. (2017). Augmented reality games for learning: A literature review. In N. Streitz, & P. Markopoulos (Eds.), Lecture Notes in Computer Science. Distributed, Ambient and Pervasive Interactions , (vol. 10291, pp. 612–626). Cham: Springer International Publishing. https://doi.org/10.1007/978-3-319-58697-7_46 .

Lim, C. (2004). Engaging learners in online learning environments. TechTrends , 48 (4), 16–23 Retrieved from https://link.springer.com/content/pdf/10.1007%2FBF02763440.pdf .

Lopera Medina, S. (2014). Motivation conditions in a foreign language reading comprehension course offering both a web-based modality and a face-to-face modality (Las condiciones de motivación en un curso de comprensión de lectura en lengua extranjera (LE) ofrecido tanto en la modalidad presencial como en la modalidad a distancia en la web). PROFILE: Issues in Teachers’ Professional Development , 16 (1), 89–104 Retrieved from https://search.proquest.com/docview/1697487398?accountid=12968 .

Lundin, M., Bergviken Rensfeldt, A., Hillman, T., Lantz-Andersson, A., & Peterson, L. (2018). Higher education dominance and siloed knowledge: A systematic review of flipped classroom research. International Journal of Educational Technology in Higher Education , 15 (1), 1. https://doi.org/10.1186/s41239-018-0101-6 .

Ma, J., Han, X., Yang, J., & Cheng, J. (2015). Examining the necessary condition for engagement in an online learning environment based on learning analytics approach: The role of the instructor. The Internet and Higher Education , 24 , 26–34. https://doi.org/10.1016/j.iheduc.2014.09.005 .

Mahatmya, D., Lohman, B. J., Matjasko, J. L., & Farb, A. F. (2012). Engagement across developmental periods. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 45–63). Boston: Springer US Retrieved from http://link.springer.com/10.1007/978-1-4614-2018-7_3 .

Mansouri, A. S., & Piki, A. (2016). An exploration into the impact of blogs on students’ learning: Case studies in postgraduate business education. Innovations in Education and Teaching International , 53 (3), 260–273. https://doi.org/10.1080/14703297.2014.997777 .

Martin, A. J. (2012). Motivation and engagement: Conceptual, operational, and empirical clarity. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 303–311). Boston: Springer US. https://doi.org/10.1007/978-1-4614-2018-7_14 .

McCutcheon, K., Lohan, M., Traynor, M., & Martin, D. (2015). A systematic review evaluating the impact of online or blended learning vs. face-to-face learning of clinical skills in undergraduate nurse education. Journal of Advanced Nursing , 71 (2), 255–270. https://doi.org/10.1111/jan.12509 .

Miake-Lye, I. M., Hempel, S., Shanman, R., & Shekelle, P. G. (2016). What is an evidence map? A systematic review of published evidence maps and their definitions, methods, and products. Systematic Reviews , 5 , 28. https://doi.org/10.1186/s13643-016-0204-x .

Moher, D., Liberati, A., Tetzlaff, J., & Altman, D. G. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. BMJ (Clinical Research Ed.) , 339 , b2535. https://doi.org/10.1136/bmj.b2535 .

Nelson Laird, T. F., & Kuh, G. D. (2005). Student experiences with information technology and their relationship to other aspects of student engagement. Research in Higher Education , 46 (2), 211–233. https://doi.org/10.1007/s11162-004-1600-y .

Nguyen, L., Barton, S. M., & Nguyen, L. T. (2015). iPads in higher education-hype and hope. British Journal of Educational Technology , 46 (1), 190–203. https://doi.org/10.1111/bjet.12137 .

Nicholas, D., Watkinson, A., Jamali, H. R., Herman, E., Tenopir, C., Volentine, R., … Levine, K. (2015). Peer review: Still king in the digital age. Learned Publishing , 28 (1), 15–21. https://doi.org/10.1087/20150104 .

Nikou, S. A., & Economides, A. A. (2018). Mobile-based assessment: A literature review of publications in major referred journals from 2009 to 2018. Computers & Education , 125 , 101–119. https://doi.org/10.1016/j.compedu.2018.06.006 .

Norris, L., & Coutas, P. (2014). Cinderella’s coach or just another pumpkin? Information communication technologies and the continuing marginalisation of languages in Australian schools. Australian Review of Applied Linguistics , 37 (1), 43–61 Retrieved from http://www.jbe-platform.com/content/journals/10.1075/aral.37.1.03nor .

OECD (2015a). Schooling redesigned. Educational Research and Innovation . OECD Publishing Retrieved from http://www.oecd-ilibrary.org/education/schooling-redesigned_9789264245914-en .

OECD (2015b). Students, computers and learning . PISA: OECD Publishing Retrieved from http://www.oecd-ilibrary.org/education/students-computers-and-learning_9789264239555-en .

O’Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education , 25 , 85–95. https://doi.org/10.1016/j.iheduc.2015.02.002 .

O’Gorman, E., Salmon, N., & Murphy, C.-A. (2016). Schools as sanctuaries: A systematic review of contextual factors which contribute to student retention in alternative education. International Journal of Inclusive Education , 20 (5), 536–551. https://doi.org/10.1080/13603116.2015.1095251 .

Oliver, B., & de St Jorre, T. J. (2018). Graduate attributes for 2020 and beyond: Recommendations for Australian higher education providers. Higher Education Research and Development , 1–16. https://doi.org/10.1080/07294360.2018.1446415 .

O’Mara-Eves, A., Brunton, G., McDaid, D., Kavanagh, J., Oliver, S., & Thomas, J. (2014). Techniques for identifying cross-disciplinary and ‘hard-to-detect’ evidence for systematic review. Research Synthesis Methods , 5 (1), 50–59. https://doi.org/10.1002/jrsm.1094 .

Payne, L. (2017). Student engagement: Three models for its investigation. Journal of Further and Higher Education , 3 (2), 1–17. https://doi.org/10.1080/0309877X.2017.1391186 .

Pekrun, R., & Linnenbrink-Garcia, L. (2012). Academic emotions and student engagement. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 259–282). Boston: Springer US Retrieved from http://link.springer.com/10.1007/978-1-4614-2018-7_12 .

Popenici, S. (2013). Towards a new vision for university governance, pedagogies and student engagement. In E. Dunne, & D. Owen (Eds.), The student engagement handbook: Practice in higher education , (1st ed., pp. 23–42). Bingley: Emerald.

Price, L., Richardson, J. T., & Jelfs, A. (2007). Face-to-face versus online tutoring support in distance education. Studies in Higher Education , 32 (1), 1–20.

Quin, D. (2017). Longitudinal and contextual associations between teacher–student relationships and student engagement. Review of Educational Research , 87 (2), 345–387. https://doi.org/10.3102/0034654316669434 .

Rashid, T., & Asghar, H. M. (2016). Technology use, self-directed learning, student engagement and academic performance: Examining the interrelations. Computers in Human Behavior , 63 , 604–612. https://doi.org/10.1016/j.chb.2016.05.084 .

Redecker, C. (2017). European framework for the digital competence of educators . Luxembourg: Office of the European Union.

Redmond, P., Heffernan, A., Abawi, L., Brown, A., & Henderson, R. (2018). An online engagement framework for higher education. Online Learning , 22 (1). https://doi.org/10.24059/olj.v22i1.1175 .

Reeve, J. (2012). A self-determination theory perspective on student engagement. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 149–172). Boston: Springer US Retrieved from http://link.springer.com/10.1007/978-1-4614-2018-7_7 .

Reeve, J., & Tseng, C.-M. (2011). Agency as a fourth aspect of students’ engagement during learning activities. Contemporary Educational Psychology , 36 (4), 257–267. https://doi.org/10.1016/j.cedpsych.2011.05.002 .

Reschly, A. L., & Christenson, S. L. (2012). Jingle, jangle, and conceptual haziness: Evolution and future directions of the engagement construct. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 3–19). Boston: Springer US Retrieved from http://link.springer.com/10.1007/978-1-4614-2018-7_1 .

Salaber, J. (2014). Facilitating student engagement and collaboration in a large postgraduate course using wiki-based activities. The International Journal of Management Education , 12 (2), 115–126. https://doi.org/10.1016/j.ijme.2014.03.006 .

Schindler, L. A., Burkholder, G. J., Morad, O. A., & Marsh, C. (2017). Computer-based technology and student engagement: A critical review of the literature. International Journal of Educational Technology in Higher Education , 14 (1), 253. https://doi.org/10.1186/s41239-017-0063-0 .

Selwyn, N. (2016). Digital downsides: Exploring university students’ negative engagements with digital technology. Teaching in Higher Education , 21 (8), 1006–1021. https://doi.org/10.1080/13562517.2016.1213229 .

Shonfeld, M., & Ronen, I. (2015). Online learning for students from diverse backgrounds: Learning disability students, excellent students and average students. IAFOR Journal of Education , 3 (2), 13–29.

Skinner, E., & Pitzer, J. R. (2012). Developmental dynamics of student engagement, coping, and everyday resilience. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 21–44). Boston: Springer US.

Smidt, E., Bunk, J., McGrory, B., Li, R., & Gatenby, T. (2014). Student attitudes about distance education: Focusing on context and effective practices. IAFOR Journal of Education , 2 (1), 40–64.

Smith, R. (2006). Peer review: A flawed process at the heart of science and journals. Journal of the Royal Society of Medicine , 99 , 178–182.

Smith, T., & Lambert, R. (2014). A systematic review investigating the use of twitter and Facebook in university-based healthcare education. Health Education , 114 (5), 347–366. https://doi.org/10.1108/HE-07-2013-0030 .

Solomonides, I. (2013). A relational and multidimensional model of student engagement. In E. Dunne, & D. Owen (Eds.), The student engagement handbook: Practice in higher education , (1st ed., pp. 43–58). Bingley: Emerald.

Sosa Neira, E. A., Salinas, J., & de Benito, B. (2017). Emerging technologies (ETs) in education: A systematic review of the literature published between 2006 and 2016. International Journal of Emerging Technologies in Learning (IJET) , 12 (05), 128. https://doi.org/10.3991/ijet.v12i05.6939 .

Sullivan, M., & Longnecker, N. (2014). Class blogs as a teaching tool to promote writing and student interaction. Australasian Journal of Educational Technology , 30 (4), 390–401. https://doi.org/10.14742/ajet.322 .

Sun, J. C.-Y., & Rueda, R. (2012). Situational interest, computer self-efficacy and self-regulation: Their impact on student engagement in distance education. British Journal of Educational Technology , 43 (2), 191–204. https://doi.org/10.1111/j.1467-8535.2010.01157.x .

Szabo, Z., & Schwartz, J. (2011). Learning methods for teacher education: The use of online discussions to improve critical thinking. Technology, Pedagogy and Education , 20 (1), 79–94. https://doi.org/10.1080/1475939x.2010.534866 .

Tamim, R. M., Bernard, R. M., Borokhovski, E., Abrami, P. C., & Schmid, R. F. (2011). What forty years of research says about the impact of technology on learning: A second-order meta-analysis and validation study. Review of Educational Research , 81 (1), 4–28. https://doi.org/10.3102/0034654310393361 .

Trowler, V. (2010). Student engagement literature review . York: The Higher Education Academy Retrieved from https://www.heacademy.ac.uk/system/files/studentengagementliteraturereview_1.pdf .

Van Rooij, E., Brouwer, J., Fokkens-Bruinsma, M., Jansen, E., Donche, V., & Noyens, D. (2017). A systematic review of factors related to first-year students’ success in Dutch and Flemish higher education. Pedagogische Studien , 94 (5), 360–405 Retrieved from https://repository.uantwerpen.be/docman/irua/cebc4c/149722.pdf .

Vural, O. F. (2013). The impact of a question-embedded video-based learning tool on E-learning. Educational Sciences: Theory and Practice , 13 (2), 1315–1323.

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes . Cambridge: Harvard University Press.

Webb, L., Clough, J., O’Reilly, D., Wilmott, D., & Witham, G. (2017). The utility and impact of information communication technology (ICT) for pre-registration nurse education: A narrative synthesis systematic review. Nurse Education Today , 48 , 160–171. https://doi.org/10.1016/j.nedt.2016.10.007 .

Wekullo, C. S. (2019). International undergraduate student engagement: Implications for higher education administrators. Journal of International Students , 9 (1), 320–337. https://doi.org/10.32674/jis.v9i1.257 .

Wimpenny, K., & Savin-Baden, M. (2013). Alienation, agency and authenticity: A synthesis of the literature on student engagement. Teaching in Higher Education , 18 (3), 311–326. https://doi.org/10.1080/13562517.2012.725223 .

Winstone, N. E., Nash, R. A., Parker, M., & Rowntree, J. (2017). Supporting learners’ agentic engagement with feedback: A systematic review and a taxonomy of recipience processes. Educational Psychologist , 52 (1), 17–37. https://doi.org/10.1080/00461520.2016.1207538 .

Zepke, N. (2014). Student engagement research in higher education: Questioning an academic orthodoxy. Teaching in Higher Education , 19 (6), 697–708. https://doi.org/10.1080/13562517.2014.901956 .

Zepke, N. (2018). Student engagement in neo-liberal times: What is missing? Higher Education Research and Development , 37 (2), 433–446. https://doi.org/10.1080/07294360.2017.1370440 .

Zepke, N., & Leach, L. (2010). Improving student engagement: Ten proposals for action. Active Learning in Higher Education , 11 (3), 167–177. https://doi.org/10.1177/1469787410379680 .

Zhang, A., & Aasheim, C. (2011). Academic success factors: An IT student perspective. Journal of Information Technology Education: Research , 10 , 309–331. https://doi.org/10.28945/1518 .


Acknowledgements

The authors thank the two student assistants who helped during the article retrieval and screening stage.

Funding

This research resulted from the ActiveLearn project, funded by the Bundesministerium für Bildung und Forschung (BMBF; German Ministry of Education and Research) [grant number 16DHL1007].

Author information

Authors and Affiliations

Faculty of Education and Social Sciences (COER), Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany

Melissa Bond, Svenja Bedenlier & Olaf Zawacki-Richter

Learning Lab, Universität Duisburg-Essen, Essen, Germany

Katja Buntins & Michael Kerres


Contributions

All authors contributed to the design and conceptualisation of the systematic review. MB, KB and SB conducted the systematic review search and data extraction. MB undertook the literature review on student engagement and educational technology, and co-wrote the method, results, discussion and conclusion sections. KB designed and executed the sampling strategy, produced all of the graphs and tables, and assisted with the formulation of the article. SB co-wrote the method, results, discussion and conclusion sections, and proofread the introduction and literature review sections. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Melissa Bond .

Ethics declarations

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

Literature reviews (LR) and systematic reviews (SR) on student engagement

Additional file 2.

Indicators of engagement and disengagement

Additional file 3.

Literature reviews (LR) and systematic reviews (SR) on student engagement and technology in higher education (HE)

Additional file 4.

Educational technology tool typology based on Bower ( 2016 ) and Educational technology tools used

Additional file 5.

Text-based tool examples by engagement domain

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article

Cite this article

Bond, M., Buntins, K., Bedenlier, S. et al. Mapping research in student engagement and educational technology in higher education: a systematic evidence map. Int J Educ Technol High Educ 17 , 2 (2020). https://doi.org/10.1186/s41239-019-0176-8


Received : 01 May 2019

Accepted : 17 December 2019

Published : 22 January 2020

DOI : https://doi.org/10.1186/s41239-019-0176-8


Keywords

  • Educational technology
  • Higher education
  • Systematic review
  • Evidence map
  • Student engagement

Ecology and Evolution, 12(12), December 2022

Real‐life research projects improve student engagement and provide reliable data for academics

Sarah A. Marley

1 Scotland's Rural College (SRUC), Aberdeen UK

2 School of Biological Sciences, University of Portsmouth, Portsmouth UK

Alessandro Siani

Stuart Sims

3 Academic & Learning Enhancement, Information & Library Services, University of Greenwich, London UK

Associated Data

The data used in this project are freely available from: https://doi.org/10.6084/m9.figshare.20361861

Student engagement can have a positive influence on student success. Many methods exist for fostering engagement but tend to be generic and require tailoring to specific contexts, subjects, and students. In the case of undergraduate science students, practical classes are a popular tool for increasing engagement. However, despite strong potential for improvement via links with “real life” research projects (RLRPs), few academic staff incorporate research participation with teaching activities. This is potentially due to poor time availability and low opinions of students' ability to collect reliable data. This study aims to examine whether involvement with RLRPs can generate reliable scientific data and also act as a motivational tool for engaging tertiary science students. A preexisting core activity for first‐year biology and marine biology students was modified to include a short RLRP component. Student‐based data collection and a questionnaire about experiences were used to examine the reliability of student‐collected data and student perceptions of RLRPs. Results indicated that error rate in student‐collected data was minimal. Irrespective of participating in a “normal” practical class or a class with a RLRP component, students collected equally accurate data. However, when the topic aligned specifically with their degree subject, student accuracy was higher. All students surveyed reported high motivation with the idea of RLRP participation, placing high importance on this from an educational and employability perspective. Yet, students were not confident about participating in RLRPs until they had engaged with one, suggesting that introducing such projects into taught sessions early‐on may encourage students to seek further opportunities in the future. In conclusion, incorporating RLRPs into the curriculum of undergraduate science courses has considerable potential benefits for both students and academic staff.

Few academic staff involve undergraduate students in research activities, possibly due to low opinions of students' ability to collect reliable data. This study demonstrated that the error rate in student‐collected data was minimal and that students found the idea of participating in research projects highly motivating. Merging research and teaching activities has considerable potential benefits for both students and academics.


1. INTRODUCTION

Student success can be defined in many ways and influenced by a range of factors. Although numerous studies focus on academic achievement, success can also involve acquisition of general knowledge; development of competence, cognitive skills, and intellectual dispositions; preparation for adulthood and citizenship; and personal development (Braxton,  2006 ). Therefore, when considering student success, it is important to look beyond grades or degree attainment and also consider acquisition of desired knowledge, transferable skills, and competencies (Kuh et al.,  2007 ).

Regardless of how student success is defined, some of the key contributors are background characteristics such as ethnicity, family income, and first‐generation status (Lundberg et al.,  2007 ; Powell et al., 1990 ; Smith & White,  2015 ). These characteristics are beyond the control of educators; however, an additional aspect that contributes to student success is engagement. Student engagement is both an intrinsic and an extrinsic factor; it reflects the quality of effort students devote to educationally purposeful activities and the effort institutions devote to using effective educational practices (Kuh,  2001 ; Kuh et al.,  2007 ). It can also be considered as both an outcome and a process; as students become engaged, their involvement can promote ever greater engagement (Reschly & Christenson,  2012 ). Indeed, engagement can potentially gain a metacognitive aspect as students become aware of their own learning process (Haave,  2016 ; Hacker,  1998 ; Larmar & Lodge,  2014 ; Marra et al.,  2021 ; Tanner,  2012 ). Engagement is important because it is an aspect within the control of educators that can positively influence student success. For example, engagement is positively related to academic grades, critical thinking, and persistence among tertiary students (Carini et al.,  2006 ; Finn & Zimmer,  2012 ; Fraysier et al.,  2020 ; Kuh et al.,  2008 ; McCormick et al.,  2015 ; Schudde,  2019 ).

There are a number of existing proposals for fostering student engagement, which are typically categorized according to a four‐strand conceptual organizer (Zepke & Leach,  2010 ): (1) Motivation and agency, where engaged students are intrinsically motivated and want to exercise their agency; (2) transactional engagement, where students and teachers engage with each other; (3) institutional support, where institutions provide an environment conducive to learning; and (4) active citizenship, where students and institutions work together to enable challenges to social beliefs and practices. Institutional support and active citizenship are typically controlled at the institution level, whereas motivation and agency and transactional engagement can be more flexible as they are typically controlled at the educator level. Therefore, motivation, agency, and transactional engagement arguably offer some of the greatest opportunities for educators to foster student engagement.

1.1. Student motivation, agency, and transactional engagement

Motivation is of particular pedagogical interest, although the relationship between motivation and student success is complex. This is partly because motivation is defined as an “internal force that determines the goals of a person” (Sutherland,  1995 ). It is therefore a hypothetical construct that is difficult to test; consequently, it is often inferred from behavior (Breen & Lindsay,  1999 ). Furthermore, learning motivation can be intrinsic (e.g., striving to achieve understanding) or extrinsic (e.g., striving to obtain high grades) (Areepattamannil et al.,  2011 ). Typically, engaged students are intrinsically motivated (Zepke & Leach,  2010 ), and thus invest more effort and take greater care in their work. This is rewarded by higher student achievement and success in comparison with extrinsically motivated students (Areepattamannil et al.,  2011 ). In some cases, strong motivation can even offset student background characteristics to positively influence student success through heightened engagement (Allen,  1999 ).

Similarly, agency (i.e., control over one's learning activities) can also improve engagement. Agency allows students the opportunity to learn how to make decisions to successfully complete tasks, but it also fosters the motivation to persevere in the face of difficulties (Vaughn,  2020 ). When students are required to take responsibility for activities, they become invested in the activity and more committed to their studies (Kuh et al.,  2008 ). Agentic engagement has been shown to improve students' achievements and mitigate disengagement (Anderson et al.,  2019 ; Reeve & Tseng,  2011 ). Beyond individual outcomes, increased student agency can also facilitate broader societal outcomes, such as improved intercultural relationships, internationalism, and globalism (Stenalt & Lassesen,  2022 ). A key feature of agency is that it allows learning experiences to move beyond a transactional approach, fostering collaboration between students and educators (Vaughn,  2020 ).

Positive interactions between students and educators are central to successful engagement (Kuh et al.,  2006 ). Approachable teachers who create inviting learning environments, are available to discuss student performance, and offer student support are more likely to experience heightened student engagement, performance, retention, and loyalty (Bryson & Hand,  2007 ; Mearns et al.,  2007 ; Snijders et al.,  2020 ). Positive relationships with staff can also promote a sense of belonging within students, particularly those from ethnic minorities (Meeuwisse et al.,  2010 ). Students themselves also recognize that strong relationships with staff are important (Snijders et al.,  2018 ). It is therefore important to not only create chances for student–teacher interactions, but also allow opportunities for these to develop into quality relationships. Educators should thus design activities that promote engagement, intrinsic motivation, learner responsibility, and interaction between students and staff.

Proposals for improving student engagement that are linked to motivation, agency, and transactional engagement include the following: enhancing students' self‐belief; enabling students to work autonomously; recognizing that teaching and teachers are central to engagement; creating active‐learning environments; fostering collaborative learning; and generating challenging educational experiences (Russell & Slater,  2011 ; Zepke,  2013 ; Zepke & Leach,  2010 ). However, implementing these generic proposals can be challenging for educators. There is a need to investigate how these suggestions can be successfully integrated for specific subjects and students.

1.2. Engagement via practical classes

In the case of undergraduate science students, a teaching method that has been shown to increase student engagement is the use of practical classes (Charney et al.,  2007 ). Practicals can take a range of forms, from teacher‐led demonstrations to “recipe‐style” activities to independent research projects (Dunlop et al.,  2019 ). These activities offer an opportunity to develop conceptual knowledge, technical skills, and general scientific literacy (Areepattamannil et al.,  2011 ; Ferreira & Morais, 2020 ; Freedman,  1997 ; Hofstein & Lunetta,  2004 ; Millar & Abrahams,  2009 ; Tobin,  1990 ). Within practical classes, exercises commonly involve cooperative learning in groups, which also enhances competencies related to social interaction (Goldschmidt & Bogner,  2016 ). Students themselves recognize the importance of such classes, ranking “technical skills” of high importance and second only to learning the general theory of their subject (Edmondston et al.,  2010 ). They also find practicals useful and enjoyable compared with other science teaching and learning activities, with student enjoyment strongly linked to better learning (Abrahams & Millar,  2008 ; Kickert et al.,  2022 ). Practicals have positive links with the aforementioned concepts of student engagement. Class sizes are typically smaller than those of lectures, allowing more frequent and higher quality transactional engagement between students and educators. Furthermore, practicals allow development of student motivation and the opportunity to exercise agency. Consequently, practical classes have a distinctive and central role in science curricula (Goldschmidt & Bogner,  2016 ).

However, given the logistical complexities involved in planning practicals, these activities are at risk of being ineffective (Millar & Abrahams,  2009 ). Without a clear and precise purpose, activities can be ill‐conceived, confused, poorly explained, and unproductive (Hodson,  1991 ; Kulgemeyer,  2018 ). In such cases, practical work fails to foster scientific thinking skills and cognitive achievement (Abrahams & Millar,  2008 ). Practical activities also risk being overly prescriptive; this constrained agency can lead to dissatisfaction and underperformance (Stenalt & Lassesen,  2022 ). Students are quick to perceive activities as “meaningless” and thus attribute them to having low value (Abrahams & Reiss,  2012 ), which is a particular risk of nonassessed activities (Kickert et al.,  2022 ). This perception of meaningfulness and value is important for student motivation; when students perceive value in tasks, they will actively engage and use active‐learning strategies, whereas when students do not perceive value, they use surface‐learning strategies (Tuan et al.,  2005 ). Thus, there is a need to carefully plan practical classes to maximize their positive impacts and ensure such activities are valuable for students.

1.3. Engagement via “real life” research projects ( RLRPs )

One method for strengthening the value and benefits of practical classes is by creating links with academic research. Although many undergraduate science students complete a course‐based undergraduate research project in their final year, there is benefit to engaging students with research earlier in their degrees in order to facilitate progressive capacity development throughout the entire course (Brew,  2013 ; Clark & Hordosy,  2019 ; Howell,  2021 ). Staff research activity positively influences student learning (Guerin & Ranasinghe,  2010 ; Guo et al.,  2018 ; Hattie & Marsh,  1996 ; Jenkins et al.,  1998 ; Neumann,  1992 ; Ramsden & Moses,  1992 ), and involvement in academic‐led “real life” research projects (RLRPs) facilitates student‐centered learning in an authentic learning environment (Breen & Lindsay,  1999 ; Charney et al.,  2007 ; Healey et al.,  2010 ). The student is immersed in a collaborative environment where practical skills are connected to real science through meaningful tasks (Bigot‐Cormier & Berenguer,  2017 ; Charney et al.,  2007 ). Involvement in such projects has the potential to be motivating, provide opportunities for agency, and allow collaborations between students and researchers. Students thus perceive involvement in research as valuable, and even enjoyable. Indeed, previous studies have found an association between higher course satisfaction and students with positive attitudes toward research, which included students having an interest in research and wishing to participate in research (Breen & Lindsay,  1999 ; Healey et al.,  2010 ; Howell,  2021 ; Jenkins et al.,  1998 ). Involvement in RLRPs has also been reported to positively influence the future career decisions of students (Dunlop et al.,  2019 ).

Yet, few tertiary lecturers appear to incorporate research participation with teaching activities. This may be the result of poor time availability to design and trial such activities, a conflict in time between teaching and research, or low opinions of students' ability to collect accurate and reliable data and positively contribute to research outcomes (Brew & Mantai,  2017 ; Hattie & Marsh,  1996 ; Kloser et al.,  2011 ). Many of the challenges around engaging inexperienced students in research and data collection are shared by the citizen science movement. For example, data quality is widely seen as a problem for those working in citizen science, with a range of strategies undertaken to ameliorate this, including close supervision, cross‐checking results, and simplifying tasks (Riesch & Potter,  2014 ). Indeed, Mitchell et al. ( 2017 ) discovered that engaging undergraduate students in citizen science actually decreased the students' own perception of the accuracy of measurements taken and the usefulness of such data, although engagement increased. However, while a wealth of data is collected annually as part of Course‐Based Undergraduate Research Experiences (CUREs), these observations very rarely go beyond the classroom, irrespective of their quality or originality (Messager et al.,  2022 ). Conversely, involving students in RLRPs allows them to participate in projects that have real‐world applications, demonstrating the usefulness of research in general and of their own skills in particular. Involving students in the research of teaching staff could also be more time‐efficient for academics by providing additional opportunities for research and a higher number of "person hours" for data collection (Harland,  2016 ; Tight,  2016 ). Yet, many academic staff remain to be convinced of the quality of student‐collected data, to the detriment of their own research and student learning.

1.4. Study aims

This study aims to examine whether involvement with RLRPs can generate reliable scientific data and be used as a motivational tool for engaging tertiary science students. Specifically, this study will (1) measure the overall reliability of student‐collected data in practical classes; (2) compare the accuracy of student‐collected data in classes with and without a RLRP component; and (3) evaluate student perceptions of RLRPs. Findings will first demonstrate whether student‐collected data are of sufficient quality for inclusion in academic research projects, which will be of use to staff considering such data gathering activities. Findings will also indicate whether student motivation increases through involvement in research projects by comparing students participating in classes with and without a RLRP component, both through student surveys exploring perceptions and by comparing error rates of student‐collected data. This dual approach will allow examination of student motivation in terms of the way students think and the way they behave. Overall, these results will inform academics about the benefits of incorporating RLRPs into their activities, from both a research and a teaching perspective.

2. METHODS

This study recruited first‐year students studying BSc Marine Biology and BSc Biology at the University of Portsmouth. As part of a core first‐year module designed to train students in a range of essential laboratory techniques, these students undertake a single‐instance practical to gain familiarity with dissection techniques and spotted dogfish ( Scyliorhinus canicula ) morphology. Students work in pairs, receive a specimen for examination, and are asked to complete a workbook regarding anatomical features.

Convenience sampling was conducted to recruit students from this practical to the current study. This utilized the lead author's previous experience with teaching this activity to ensure study design would not impact learning objectives. This study was undertaken in accordance with the University of Portsmouth Ethics Policy (No: ED182005). All participants were informed of the voluntary and anonymous nature of the study, and of their right to withdraw without any negative repercussions on achievement and progression.

2.1. Data collection

A visual overview of the data collection methodology is provided in Appendix  A . Due to the size of the first‐year student cohort (120 students), this practical class had three repeats over a three‐day period in January 2020. All classes were delivered by the lead author and supported by the same technician and demonstrating assistant. Each morning, dissection kits and specimens were prepared by the technician to allow one station per student pair. The stations were labeled sequentially, so that each dogfish had an individual identification number. Prior to the start of each class, measurements were taken by the lead author and two laboratory assistants for all dogfish (e.g., total length and fin length). This represented a “ground‐truthed” dataset with which to compare student‐collected measurements.

Approximately 40 students were timetabled to attend each class. The first class was timetabled to contain only Marine Biology students, the second class contained students from both degree streams, and the third class contained only Biology students. Therefore, the two "pure" classes were initially selected as Experimental Groups, while the mixed class was kept as a no‐treatment Control Group. Although the authors recognize that this does not represent an ideal experimental design, limited institutional resources and timetabling requirements restricted full educator control over this arrangement. This study limitation is further considered in the Discussion. Additionally, there was one case of a student attending on the wrong day, resulting in a single mixed pair in one of the Experimental Groups (see Section 3).

Both Control and Experimental Groups had the same taught material to ensure no unfairness in terms of their education and learning outcomes. This included a description of sexual dimorphism (i.e., where two sexes of the same species exhibit different characteristics), which has been shown to exist in dogfish for a range of anatomical features (Filiz & Taskavak,  2006 ). This information was used to justify why students were recording measurements from their specimens. However, the Experimental Group was also told that their worksheets would be collected at the end of class to contribute to a scientific study investigating dogfish sexual dimorphism (which is indeed being conducted by the lead author); this was the only orchestrated difference between the Control and Experimental Groups. The importance of collecting accurate scientific measurements was emphasized to all students, regardless of their grouping.

A two‐sided worksheet was given to all pairs for in‐class completion and return. Each worksheet asked the pairs to indicate the degree stream they were from (Marine, Biology, or Both if a mixed pair) and the day of the week their class occurred. This information was collected to help explain any underlying differences between students; for example, differing experiences between degree streams or communication between students on different days. No additional background characteristics or demographic data were collected, due to the logistical challenges of ensuring student privacy while also linking such information to ground‐truthed measurement records. Additionally, given the relatively small class sizes, it is unlikely that sample sizes representative of different demographic groups would have been sufficient for statistical analysis.

The front page of the worksheet was specific to dogfish measurements and group work; it contained a diagram of a dogfish indicating eight sites along the body where measurements were to be recorded, along with details on the dogfish sex and ID number (Appendix B). The back page was specific to individual student motivation; it contained six Likert‐style questions (one set per student; Table 1) relating to individual perceptions of practical classes, confidence in their own technical skills, and opinions on involving students in RLRPs. To provide context and further interrogate the impact of the experiment, open‐text comments about students' perceptions of undertaking the measurements were also collected on the survey. As these were a brief adjunct to the research, they were not intended for rigorous qualitative analysis (LaDonna et al., 2018). Rather, the comments were categorized according to a descriptive interpretation of their core focus (e.g., "confidence"). Each statement could have multiple foci, for example, if a student talked about gaining confidence but also acknowledged the possible employment benefits of engaging in the activity. These different foci were then quantified to build an understanding of the range of perceptions across the cohorts.
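The multi‐focus coding described above amounts to a simple tally across comments. As a hypothetical illustration only (the comments and focus labels below are invented, not the study data), the quantification step can be sketched in Python:

```python
from collections import Counter

# Invented example: each open-text comment is coded with one or more
# focus labels (the real coding was done manually by the authors).
coded_comments = [
    ["confidence", "careers"],   # e.g. gained confidence and employability
    ["comprehension"],
    ["experience", "careers"],
]

# Tally each focus across all comments; a comment with two foci
# contributes once to each of its foci.
focus_counts = Counter(focus for foci in coded_comments for focus in foci)
```

A comment mentioning both confidence and employment benefits is thus counted under both themes, which is why the number of coded foci can exceed the number of comments.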

Table 1. List of six Likert‐style questions that students were asked to complete at the end of their practical class.

Note: Question 6 was also followed by an open question seeking further explanation.

2.2. Data analysis

Data were analyzed in R (v. 4.2.1) using the packages car and dunn.test (Dinno, 2017; Fox & Weisberg, 2019). A significance level of 0.05 was used for all analyses (except those where a Bonferroni correction was applied).

The overall proportions of complete versus incomplete worksheets were compared by Treatment Group using a chi‐square test for association. Student‐collected measurements were compared with ground‐truthed measurements to give an error rate; this was taken as an indication of accuracy. Data screening was undertaken using Shapiro–Wilk and Levene's tests; however, the data were neither normally distributed nor homogeneous in variance. Therefore, nonparametric tests were used to compare the error rate according to three explanatory variables (Table 2). Wilcoxon tests were used to compare the median error rate according to Treatment Group. Pair Composition and Measurement Site were investigated using Kruskal–Wallis tests and post hoc Dunn's tests with Bonferroni corrections.
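The authors ran this pipeline in R with the car and dunn.test packages. Purely as an illustration (simulated data, and Python/scipy rather than R), the same screening‐then‐nonparametric sequence might look like the sketch below; note that R's unpaired wilcox.test corresponds to scipy's Mann–Whitney rank‐sum test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated signed measurement errors (cm) for two treatment groups; the
# real analysis used student worksheet values minus ground-truthed records.
control = rng.normal(0.1, 0.6, 150)
experimental = rng.normal(-0.1, 0.6, 250)

# Screening: Shapiro-Wilk for normality, Levene's test for equal variances.
_, p_norm = stats.shapiro(control)
_, p_var = stats.levene(control, experimental)

# Two groups (Treatment): rank-sum test (the unpaired "Wilcoxon" in R).
w_stat, p_group = stats.mannwhitneyu(control, experimental)

# Three or more groups (e.g. Pair Composition, Measurement Site):
# Kruskal-Wallis, followed post hoc by Dunn's tests if significant.
site_a, site_b, site_c = np.split(experimental[:240], 3)
h_stat, p_sites = stats.kruskal(site_a, site_b, site_c)
```

The group sizes, means, and spreads here are arbitrary; only the order of operations mirrors the analysis described in the text.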

Table 2. Definitions of the three explanatory variables used during data analysis.

Student surveys were analyzed in two parts, both of which utilized chi‐square tests as these are appropriate for both nominal and ordinal data (Kraska‐Miller,  2013 ; Sirkin,  2006 ). For each of the Likert‐style questions, chi‐square tests for associations were used to investigate the proportion of students who responded from 1 (Strongly Disagree) to 5 (Strongly Agree) in relation to Treatment Group (Table  2 ). The open question was manually reviewed to establish key themes in student responses, the occurrence of which were analyzed using chi‐square tests for association in relation to Treatment Group (Table  2 ).

3. RESULTS

3.1. Student‐collected data

A total of 53 worksheets were submitted. This included a total of 20 worksheets from the Control Group, with seven Marine Biology pairs, four Biology pairs, and nine pairs from Both. The Experimental Group had a total of 33 worksheets, comprised of 16 Marine Biology pairs, 16 Biology pairs, and one pair from Both.

The majority (81.1%; n = 43) of worksheets were fully completed. Of the 10 worksheets that contained omissions, four did not indicate the dogfish sex and six were missing a morphometric measurement; only one worksheet had multiple omissions, in the form of two missing measurements. The Control Group had the larger proportion of worksheets with omissions (25.0%; n = 5), compared with 15.2% (n = 5) of the Experimental Group; however, this did not represent a significant difference between Treatment Groups (χ2 = 0.789, p = .3744). Omitted measurements were not included in the dataset; however, the remaining measurements from each affected worksheet were retained for further analysis.
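The completeness comparison can be reconstructed from the counts given in this section. A small Python check (scipy rather than the authors' R; note that R's chisq.test applies Yates' continuity correction to 2×2 tables by default, so correction=False is needed to match the uncorrected statistic reported here):

```python
from scipy.stats import chi2_contingency

# Rows: Control, Experimental; columns: complete, with omissions
# (Control: 15 of 20 complete; Experimental: 28 of 33 complete).
table = [[15, 5],
         [28, 5]]

chi2, p, df, expected = chi2_contingency(table, correction=False)
print(round(chi2, 3), df)  # chi2 ≈ 0.789 with df = 1, p ≈ .374, as reported
```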

3.1.1. Dogfish sex

Of the 48 worksheets that recorded dogfish sex, the majority (97.9%; n  = 47) correctly sexed the animals. The one erroneous record occurred in the Control Group, where a female dogfish was mistakenly identified as a male.

3.1.2. Dogfish morphometrics

The morphometric measurements submitted via worksheets were compared with the “ground‐truthed” records to obtain an error rate. Of the 417 measurements reported, the majority (78.7%; n  = 328) of worksheet measurements were within ±1 cm of the ground‐truthed records; 50.8% ( n  = 212) were within ±0.5 cm; and 11.8% ( n  = 49) were within the minimum difference of ±0.1 cm (Figure  1a ). The maximum error was −17.7 cm; however, this appeared to be the result of the students measuring the dogfish in sections and then incorrectly summing the multiple lengths. When considered by Treatment Group, 76.6% ( n  = 121) of the Control and 79.9% ( n  = 207) of the Experimental measurements were within ±1 cm of the ground‐truthed records; 46.8% ( n  = 74) Control and 53.3% ( n  = 138) Experimental within ±0.5 cm; and 10.8% ( n  = 17) Control and 12.4% ( n  = 32) Experimental within ±0.1 cm (Figure  1b, c ).
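The accuracy tiers reported above are simply the proportions of signed errors (student value minus ground‐truthed value) falling within each tolerance. A minimal sketch with invented measurements (not the study data):

```python
import numpy as np

# Invented student-recorded vs ground-truthed lengths (cm), illustration only.
student = np.array([24.5, 31.0, 18.2, 40.0, 22.3])
truth   = np.array([24.6, 30.2, 18.2, 41.5, 22.7])

errors = student - truth          # signed: positive = over-estimate

def within(tol):
    # small epsilon guards against binary floating-point artifacts at the boundary
    return np.mean(np.abs(errors) <= tol + 1e-9)

print(f"±1 cm: {within(1.0):.0%}, ±0.5 cm: {within(0.5):.0%}, "
      f"±0.1 cm: {within(0.1):.0%}")
# → ±1 cm: 80%, ±0.5 cm: 60%, ±0.1 cm: 40%
```

Keeping the sign of the error also allows the direction of bias (over‐ versus under‐estimation) to be compared between groups, as in Figure 2.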


Figure 1. Histogram of measurement errors for (a) all worksheets overall, (b) the Control Group, and (c) the Experimental Group. All data displayed in 0.1 cm bins.

A Kruskal–Wallis test showed no significant difference in measurement error according to Dogfish ID ( χ 2  = 48.241, df = 52, p  = .6225); thus, error rates were not specific to particular dogfish specimens or student pairs. However, significant differences did exist according to Treatment Group, Pair Composition, and Measurement Site.

A Wilcoxon test showed a significant difference in the median error according to Treatment Group (W = 23,690, p = .007; Figure 2a). The Control Group tended to over‐estimate measurements (median = 0.10 cm), while the Experimental Group under‐estimated (median = −0.10 cm).


Figure 2. Summary of measurement errors by (a) treatment group, (b) teaching day, (c) pair composition, and (d) dogfish measurement site. Significance levels: *** ≤ 0.001; ** ≤ 0.01; * ≤ 0.05.

A Kruskal–Wallis test indicated a significant difference in the median error according to Pair Composition ( χ 2  = 8.3201, df = 2, p  = .01561; Figure  2c ). A Dunn's test with Bonferroni corrections indicated that pairs mixed from Both degrees were significantly different to Biology‐only pairs ( Z  = −2.6731, p  = .0113), but there was no significant difference between Biology‐Marine (Z = −2.1253, p  = .0503) or Marine‐Both ( Z  = 1.0116, p  = .4676). The median errors were −0.10 cm for Biology, 0.00 cm for Marine, and 0.10 cm for Both.

A Kruskal–Wallis test indicated a significant difference in the median error according to Measurement Site on the dogfish body ( χ 2  = 26.355, df = 7, p  < .001; Figure  2d ). A Dunn's test with Bonferroni corrections indicated that the following parts were significantly different: Parts 6–1 ( Z  = −3.3041, p  = .0133), Parts 6–4 ( Z  = −4.2790, p  < .001), Parts 6–7 ( Z  = 3.3474, p  = .0114), and Parts 2–4 ( Z  = 3.1009, p  = .0270). The respective median errors for Parts 1 to 8 were − 0.20 cm, 0.10 cm, −0.05 cm, −0.70 cm, −0.05 cm, 0.20 cm, −0.30 cm, and 0.00 cm. Thus, measurements of Part 6 (mouth length) were significantly higher than Part 1, 4, and 7 (body and pectoral fin lengths); additionally, measurements of Part 2 (dorsal fin height) were significantly higher than those of Part 4 (total body length). It is worth noting that both Parts 6 and 2 were challenging areas to measure. Mouth length (Part 6) often had unclear boundaries, while dorsal fins (Part 2) are flimsy body parts that can be manipulated in different ways.
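The pairwise comparisons above were produced with the dunn.test R package. For readers wanting to see what the post hoc procedure involves, below is a from‐scratch Python sketch of Dunn's z‐statistics with Bonferroni adjustment (illustrative only; not the authors' code, and group names are invented):

```python
import itertools
import numpy as np
from scipy.stats import rankdata, norm

def dunn_bonferroni(groups):
    """Pairwise Dunn's post hoc test with Bonferroni-adjusted p-values.

    `groups` maps group name -> 1-D sequence of observations.
    """
    names = list(groups)
    data = np.concatenate([np.asarray(groups[n], float) for n in names])
    labels = np.concatenate([[n] * len(groups[n]) for n in names])
    ranks = rankdata(data)                      # mid-ranks for ties
    N = len(data)

    # Tie correction: sum of (t^3 - t) over tied values, / (12 * (N - 1)).
    _, counts = np.unique(data, return_counts=True)
    tie_term = (counts**3 - counts).sum() / (12 * (N - 1))

    m = len(names) * (len(names) - 1) // 2      # number of comparisons
    results = {}
    for a, b in itertools.combinations(names, 2):
        ra, rb = ranks[labels == a], ranks[labels == b]
        se = np.sqrt((N * (N + 1) / 12 - tie_term)
                     * (1 / len(ra) + 1 / len(rb)))
        z = (ra.mean() - rb.mean()) / se
        p = 2 * norm.sf(abs(z))                 # two-sided p-value
        results[(a, b)] = (z, min(1.0, p * m))  # Bonferroni adjustment
    return results
```

With three clearly separated groups, the most extreme pair yields a large |z| and a tiny adjusted p, while overlapping pairs may fall short of the corrected threshold, mirroring the mixed pattern of significant and nonsignificant contrasts reported here.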

3.2. Student surveys

A total of 100 completed survey forms were returned. This included a total of 29 surveys from the Control Group (Pair Compositions: 13 Marine Biology; 5 Biology; 11 Both) and 71 surveys from the Experimental Group (Pair Compositions: 31 Marine Biology; 38 Biology; 2 Both).

3.2.1. Survey questions

The survey responses for all questions are summarized in Figure  3 . Overall, students were positive about practical classes supporting their learning (Question 1). They found this particular practical class enjoyable (Question 2), but were variable in regard to their confidence in their own technical skills (Question 3). Students found the idea of working on RLRPs motivating (Question 4) and thought it was important that students were involved in such projects (Question 6), although again had varying levels of confidence in their ability to participate in RLRPs (Question 5).


Figure 3. Proportion of responses by treatment group to Likert‐style questions regarding student perceptions of practical classes and "real life" research projects. Responses ranged from 1 (Strongly Disagree) to 5 (Strongly Agree). Significant differences (p < .05) were found for all questions except Question 1.

It was not possible to investigate survey responses by degree stream, as this information was not available at the individual level due to the shared nature of the worksheet; however, responses could be considered by Treatment Group. Chi‐square tests showed no significant association between Treatment and score for Question 1 (χ2 = 0.368, df = 1, p = .544). However, there were significant associations between Treatment and score for Question 2 (χ2 = 6.253, df = 3, p = .044), Question 3 (χ2 = 18.438, df = 3, p < .001), Question 4 (χ2 = 8.612, df = 2, p = .013), Question 5 (χ2 = 28.928, df = 3, p < .001), and Question 6 (χ2 = 6.839, df = 2, p = .033) (Figure 3). In particular, the Experimental Group showed stronger motivation and greater confidence from working on RLRPs, and more strongly agreed that it was important for students to be involved with such projects. However, despite being less confident in their ability to participate in RLRPs, the Control Group appeared to enjoy this particular practical class more than the Experimental Group.

3.2.2. Survey themes

A total of 66 students provided 90 answers to the open question; the majority (97.8%; n = 88) of responses spoke positively of student involvement in RLRPs. Only two responses indicated negative opinions, both suggesting that RLRPs could be useful but that students should not be required or forced to participate in such projects.

Of the 88 positive written comments, seven key themes emerged: Experience; Confidence; Comprehension; Cost Effectiveness; Skills; Careers; and Engagement (see Table  3 for examples). Chi‐square tests found no significant association between theme and Treatment Group ( χ 2  = 10.201, df = 7, p  = .177). It was not possible to investigate the influence of degree stream on themes, as these data were not available at the individual level.

Table 3. Examples of student responses to the open question asking why it is important for students to be involved in real‐life research projects, which were then categorized into seven key themes.

Note: The percentage under each theme name reflects the proportion of written comments that aligned with that theme.

However, chi‐square tests did indicate a significant difference by theme alone (χ2 = 42.182, df = 7, p < .01; Table 3). The most popular themes reflected students remarking that participating in RLRPs gave them relevant work and research experience (Experience; 26.1%); improved their knowledge and understanding of theory (Comprehension; 22.7%); and increased their perceived employability and awareness of career options (Careers; 20.5%). Students also commented that such projects made the subject more interesting and memorable, improved their engagement, and encouraged them to try harder (Engagement; 10.2%); improved their practical abilities (Skills; 10.2%); and improved their confidence in those abilities (Confidence; 6.8%). Finally, there were some remarks on the cost‐effectiveness for research budgets of having student volunteers collecting data (Cost Effectiveness; 3.4%).

4. DISCUSSION

This study aimed to examine whether involvement with RLRPs can generate reliable scientific data and function as a motivational tool for engaging tertiary science students. Results indicated that error rate in student‐collected data was minimal; dogfish were correctly sexed on 98% of occasions and 79% of measurements were within ±1 cm of ground‐truthed records. However, differences existed between Control and Experimental Groups, as well as by degree type and measurement site. Interestingly, students were more accurate where the project topic aligned specifically with their degree subject. In terms of student perceptions, surveyed students reported that they found the idea of participating in RLRP motivating, thought it was an important component of their education, and placed strong value on such involvement for future employability. Yet, students who had not participated in a RLRP felt less confident about doing so. Overall, RLRPs are well‐perceived by students, and there is strong potential for students to contribute quality scientific data to such projects.

4.1. Student‐based data collection

Staff attitudes toward engaging students in research have the potential to limit research‐based curricula (Brew, 2013). Few academics involve undergraduate students in RLRPs, possibly as a result of concerns regarding students' ability to collect reliable data and limited perceptions of how students develop research capability (Hattie & Marsh, 1996; Wilson et al., 2012). However, this study revealed an overall high degree of accuracy in student‐collected data. Involving students in RLRPs could therefore be advantageous to staff, particularly with regard to time efficiency, by providing additional opportunities for research and a greater number of "person hours" for data collection. This also adds an incentive to the development of research‐based learning, in which students are encouraged and facilitated to undertake research and inquiry (Healey & Jenkins, 2009).

It is worth noting, however, that accuracy varied according to several factors. The error rate differed significantly by Treatment Group, although the two groups were similar in the magnitude of their errors and differed mainly in direction: the Control Group tended to over‐estimate, while the Experimental Group under‐estimated. When considered by degree stream, Marine Biology students averaged a zero error rate. Given the marine theme of the task, it may be that those students held greater interest in, or placed greater value on, this activity than Biology students (who include those intending to specialize in topics such as botany, genetics, and microbiology, as well as zoology). Furthermore, some measurements appeared more difficult to collect than others, with dogfish mouth length and total body length having particularly large error rates. The former is tricky to measure, while the latter required arithmetic because the rulers were of insufficient length. There was also variation among students, with some pairs having median error rates closer to zero than others; however, this difference was not statistically significant.

In summary, student‐collected data are reliable. However, it is essential to give students clear instructions, and it may be beneficial if the RLRP topic is of direct interest or relevance to their specialist areas.

4.2. Student perceptions of “real life” research projects ( RLRPs )

It is critically important that students hold positive attitudes toward research if they are to engage in research‐based learning (Brew, 2013). In this study, students were extremely positive about practicals in general and this class in particular, and considered practicals valuable to their learning. However, students reported low confidence in their own abilities. This may reflect their status as first‐year undergraduates, as well as the difference between "doing" a practical and actually "learning" from it (Abrahams & Millar, 2008).

Irrespective of Treatment Group, all students were positive regarding the benefits of RLRP and thought it was important for students to be involved with such projects. This demonstrates that all students were aware of the potential benefits of participating in RLRPs, even without being involved in one. Thus, it was not the case that by being involved in a RLRP students were biased toward perceiving value. Rather, nearly all students perceived participating in RLRPs to provide relevant experience, improve theoretical comprehension, and enhance employability. This mirrors other research regarding involvement in extracurricular activities, which employers value in employee selection and graduates value for ongoing professional and personal development (Stuart et al.,  2011 ).

While both Treatment Groups found the idea of participating in RLRPs motivating, students who had participated in this study's RLRP reported stronger motivation, and greater confidence in their ability to participate in such projects, than their peers. Despite the general consensus that RLRPs benefit both current learning and future employability, this suggests that students who have never taken part in one may lack the confidence to engage with RLRPs independently. Exposing students to RLRPs within a classroom environment may therefore build their confidence and encourage them to seek additional opportunities in the future, facilitating enhanced learning and career development.

4.3. Study impact

This study supports the importance of the research‐teaching nexus (Brew & Boud,  1995 ; Harland,  2016 ; Healey et al.,  2010 ; Jenkins et al.,  1998 , 2003 ; Neumann,  1994 ; Tight,  2016 ). Strengthening the links between research and teaching is important because universities have a responsibility to prepare students for professional life and research‐based learning curricula provide countless opportunities for students to develop key, discipline‐specific skills (Brew,  2013 ). Thus, there is a need to develop knowledge‐building communities within universities and shift students from traditional roles as consumers of knowledge into active producers of knowledge (Brew,  2006 ; Neary,  2010 ). This can be achieved by creating research partnerships between academics and students; for example, academic‐directed inquiry allowing students to engage in both acquiring existing and creating new knowledge (Levy & Petrulis,  2012 ). Such partnerships also enable closer contact with more intangible aspects of learning (e.g., critical approaches to knowledge, positive attitudes to learning, fostering curiosity and enjoyment; Neumann,  1994 ).

Involvement in RLRPs improves student motivation, provides opportunities for responsibility and agency, and fosters collaboration between students and academics. Because involvement in RLRPs increases student engagement, there is a justification for incorporating such involvement into curriculum development: "Educators have the privilege to shape curricula, and thereby create their students' motivational context" (Kickert et al., 2022). Additionally, given that data collected by students have a high degree of accuracy, these findings should encourage academics to actively involve undergraduate students in their research.

Practical classes have previously demonstrated success in increasing engagement among undergraduate science students (Charney et al., 2007). However, there is potential to accentuate these benefits by using RLRPs as authentic learning environments (Breen & Lindsay, 1999; Charney et al., 2007). This study built upon an existing lesson plan by adding RLRP components to a first‐year undergraduate practical class. Although these components were relatively basic in terms of broader scientific skills, framing them in the context of a RLRP added value to routine tasks that could otherwise be misinterpreted as "meaningless" by students (Abrahams & Reiss, 2012; Tuan et al., 2005). Furthermore, rather than reserving research experiences for final‐year students or those fortunate enough (or confident enough) to participate in dedicated extracurricular programs, it is important to integrate research‐based learning experiences at earlier stages so that student capacities can be progressively developed throughout their degrees (Brew, 2013; Clark & Hordosy, 2019; Howell, 2021). Enhancing standard first‐year practical classes through the inclusion of a RLRP component has the potential to reach a larger number of students, overcome background characteristics that limit participation in individual research experiences, build student confidence, and create a solid foundation of research skills from the start of the degree that can be built upon over time.

From a logistical standpoint, these additions to the lesson plan contributed approximately an extra 15 min to a 2.5‐h class. Despite a relatively small time‐investment, this resulted in higher levels of student engagement. It would therefore be feasible to consider such additions across the broader curriculum. Furthermore, these additions actually offer a time‐saving measure to academics. Many staff run practical classes directly related to their area of research, yet relatively few of these are linked with active research projects. Given the accuracy of student‐collected data revealed in this study, it is recommended that academics develop teaching materials in closer alignment with their own research objectives. This would benefit staff in terms of additional resources for data collection and subsequent increases in academic output. For instance, by collecting dogfish morphological measurements over several years, the lead author anticipates being able to investigate changes in physical characteristics and growth rates linked to sex, age, geography, fishing activity, and climate change. Thus, by aligning research and teaching, there is a heightened potential for discipline‐specific staff outputs. Students would also benefit from a dynamic curriculum that reflects current research needs, provides training in applied skills, and enhances student success.

In the long term, incorporating student involvement in RLRPs throughout the undergraduate science curriculum has the potential to positively influence course satisfaction ratings, student employability, and staff research outputs (Breen & Lindsay,  1999 ; Dunlop et al.,  2019 ). We recommend academics examine their own research activities to identify what tasks exist that would align with undergraduate skills, and consider whether such tasks could be incorporated into teaching activities to offer hands‐on experience while also generating large‐scale and long‐term data sets.

4.4. Limitations and future work

The key limitation of this study is that it was restricted to a relatively small sample size from a single institution. It is therefore best viewed as a foundation for future research, which will broaden the impact of this work to allow application at a national or international level. In the first instance, it would be useful to reconfigure the present study in a traditional experimental format (i.e., equal number of control and experimental groups), with extended data on student demographics and background characteristics, and repeat this with multiple cohorts undertaking the same practical class over a number of years to create a larger and broader dataset. In a wider context, it would be interesting to replicate the study at different institutions, degrees, and class types. Longer term, it would be beneficial to understand the longevity of benefits arising from involvement in RLRPs, as well as staff perceptions.

It is also worth considering that there are many factors that contribute toward student engagement (Braxton,  2006 ; Kuh et al.,  2007 ; Zepke & Leach,  2010 ). These can be intrinsic, such as student gender, race, ethnicity, and socio‐economic status; or they may be extrinsic and relate to aspects of the teacher, classroom, or institution. Many of these do not exist singularly, but instead interact intersectionally to enhance engagement or trigger disengagement. Some of these factors were controlled within this study; for example, the class was taught by the same person, in the same room, within the same institution. Some aspects even contributed to the study and were statistically assessed (e.g., degree type). However, inevitably, there were still a variety of measures beyond those considered here that could have influenced student engagement. Thus, this project provides a piece of the overall student engagement puzzle, which can be built upon in future studies.

5. CONCLUSION

This research demonstrates that students can collect accurate data for scientific research. In particular, student accuracy was greatest when the task aligned with their degree topic. This offers an opportunity to academics looking to build long‐term research projects requiring many "person hours" for measurement‐based tasks.

All surveyed students enjoyed this practical task, felt motivated by the idea of working on RLRPs, were keen to gain relevant work experience, and valued RLRPs in terms of enhancing their own comprehension and future employability. However, students who had participated in the RLRP component of this study showed greater motivation and confidence than their peers. Interestingly, unless students had participated in a RLRP, they did not feel confident about engaging with such projects. Thus, it is recommended that RLRPs are embedded early into the tertiary curricula to develop student confidence in contributing to “real life” research and encourage future engagement with such professional development opportunities.

In conclusion, incorporating RLRPs into the curriculum of undergraduate science students can have considerable benefits for both students and academic staff. For greatest success, we recommend that such projects are carefully designed, clearly implemented, and directly overlap with both student interests and staff expertise. Therefore, we invite academics to review their own research practices to identify how they can align more closely with teaching activities and student learning opportunities.

AUTHOR CONTRIBUTIONS

Sarah A. Marley: Conceptualization (lead); data curation (lead); formal analysis (lead); investigation (lead); methodology (lead); software (lead); visualization (lead); writing – original draft (lead); writing – review and editing (equal). Alessandro Siani: Conceptualization (supporting); methodology (supporting); writing – review and editing (equal). Stuart Sims: Conceptualization (supporting); methodology (supporting); supervision (lead); writing – review and editing (equal).

CONFLICT OF INTEREST

The authors declare no competing financial or nonfinancial interests that are directly or indirectly related to this work.

OPEN RESEARCH BADGES

This article has earned Open Data and Open Materials badges. Data and materials are available at https://doi.org/10.6084/m9.figshare.20361861.

ACKNOWLEDGMENTS

This project would not have been possible without the support of several key people. We would like to thank Marc Martin and Christine Hughes for dogfish‐sourcing and technical support. Gemma Scotts and Michael Scales kindly assisted in ground‐truthing the measurements of many, many dogfish. James Robbins assisted during the practical classes, keeping staff, students, and dogfish in line. Finally, we would like to thank the two anonymous reviewers of this manuscript for their thoughtful feedback and positive encouragement.

APPENDIX A. A conceptual infographic outlining the study design and methods.


APPENDIX B. Student worksheet for recording the sex and eight different measurement sites (red numbers) of dogfish specimens. Worksheets were completed within pairs.


Marley, S. A., Siani, A., & Sims, S. (2022). Real‐life research projects improve student engagement and provide reliable data for academics. Ecology and Evolution, 12, e9593. https://doi.org/10.1002/ece3.9593

DATA AVAILABILITY STATEMENT

Data and materials are available via figshare at https://doi.org/10.6084/m9.figshare.20361861.


Promoting Student Engagement in Online Education: Online Learning Experiences of Dutch University Students

  • Original Research
  • Open access
  • Published: 27 February 2024


  • Emma J. Vermeulen   ORCID: orcid.org/0000-0002-4363-4164 1 &
  • Monique L. L. Volman 1  

866 Accesses

10 Altmetric


Student engagement is an important factor in higher education learning, but engaging students in online learning settings has been found to be challenging. Little research has been conducted yet into how online learning activities can engage students. In this study, students’ experiences with online education were examined during the COVID-19 pandemic to find out what online learning activities promoted their engagement and what underlying engagement mechanisms informed those activities. Six online focus groups were held via Zoom with students ( N  = 25) from different social sciences programs at the University of Amsterdam. Findings revealed synchronous and asynchronous online learning activities that stimulated three dimensions of engagement and their underlying mechanisms. Behavioral engagement was stimulated through activities that promote attention and focus, inspire effort, break barriers, and provide flexibility. Affective engagement was stimulated through activities that promote a group feeling, encourage interaction, and create a sense of empathy and trust. And cognitive engagement was stimulated through activities that generate discussion and personalization. This research provides teachers with insights into how to promote student engagement in online education.


1 Introduction

In recent years, student engagement has received increasing attention in educational research and practice (Aparicio et al., 2021; Wimpenny & Savin-Baden, 2013). Across all levels and phases of education, student engagement is considered a crucial factor for student learning. Research has shown that students who are more engaged with their studies and during class make greater progress in their studies and perform better (Christenson et al., 2012; Fredricks et al., 2004; Trowler, 2010). Student engagement in higher education has also been extensively studied, including how it relates to the use of educational technology (Bond et al., 2020; Martin et al., 2020; Schindler et al., 2017).

Meyer ( 2014 ) pointed out that student engagement deserves special attention in online settings because of the diminished opportunities it offers students to engage with the institution, teachers, and peers. Whereas opportunities for study-related support and informal interaction between students and teachers have been shown to be crucial in promoting student engagement in in-person learning settings, such interactions are limited online (Redmond et al., 2018 ). This limitation seemed to be confirmed in 2020 when online teaching suddenly became the necessary means for educating students in the wake of the COVID-19 pandemic (Tartavulea et al., 2020 ; Watermeyer et al., 2020 ). Although some positive effects of online education were found, including better study results (Meeter et al., 2020 ), many studies reported lower student motivation for studying (Jensen et al., 2020 ; Means & Neisler, 2020 ; Stevens et al., 2020 ) and reduced student engagement in online education (Ali et al., 2020 ; Martin, 2020 ; Walker & Koralesky, 2021 ; Wester et al., 2021 ).

The omnipresence of online education during the COVID-19 pandemic also provided, however, a unique opportunity to gain more insight into factors that may promote student engagement in online settings. Since more teaching will likely take place online in the future, it is important to learn from the experiences during this period. Several studies described how online teaching during the pandemic affected student engagement in higher education (Ali et al., 2020 ; Martin, 2020 ; Roque-Hernández et al., 2021 ; Stevens et al., 2020 ; Walker & Koralesky, 2021 ; Wester et al., 2021 ). But little attention has been paid to what exactly affects student engagement during online learning activities and, more importantly, why student engagement is affected. The aim of the current study is to learn from students’ experiences by investigating what promotes their engagement in online learning activities and what underlying mechanisms inform those activities.

2 Theoretical Background and Overview of the Literature

2.1 Student Engagement

The literature on student engagement is vast and has recorded many varying definitions of engagement over time (Martin et al., 2020 ). This has resulted in a substantial body of research that portrays engagement as a multidimensional concept (Christenson et al., 2012 ; Fredricks et al., 2004 ). Student engagement is often conceptualized along three dimensions: behavioral, cognitive, and affective engagement (Bond et al., 2020 ; Fredricks et al., 2004 ). These three dimensions of engagement are not ontologically distinct concepts but instead are interrelated (Fredricks et al., 2004 ).

Behavioral engagement is understood as effort and participation, or students’ involvement in learning activities (Fredricks et al., 2004 ). It is measured through observable behavior, such as whether students attend classes and do their homework. Affective engagement encompasses students’ attitudes towards their educational environment, such as teachers and peers. These attitudes affect students’ drive to engage in learning activities (Fredricks et al., 2004 ). Affective engagement includes students’ expectations, assumptions, commitment, and motivations for learning (Redmond et al., 2018 ); it is also associated with their sense of belonging to a community or institution and touches upon the emotional states that influence their motivation to learn (Mulrooney & Kelly, 2020 ; Redmond et al., 2018 ). Cognitive engagement refers to students’ deeper investment in and reflection on their learning process. It appears in students’ effort to understand materials and master skills, especially complex ones. Cognitive engagement addresses students’ involvement with study materials and their own learning process on a more abstract level (Fredricks et al., 2004 ).

2.2 Online Learning Activities

Online education or learning is understood as the delivery and reception of teaching through online platforms (Hodges et al., 2020 ; Means & Neisler, 2020 ). Online education can be either synchronous , taking place in real time, or asynchronous , involving pre-recorded materials that students watch on their own time (Tartavulea et al., 2020 ). Online learning activities include all educational activities that students participate in online, such as lectures, seminars, and small group meetings, as well as one-on-one supervision and asynchronous activities, such as contributing to an online discussion platform. Online exams are also considered online learning activities.

2.3 Promoting Student Engagement in Online Higher Education

There is a vast amount of research about what fosters student engagement in higher education, such as attendance, a feeling of belonging, and academic support (Christenson et al., 2012 ; Martin et al., 2020 ; Trowler, 2010 ). There is also ample research on what promotes student engagement in online learning. A systematic review of online learning research conducted by Martin et al. ( 2020 ) demonstrated that the largest number of studies focused on student engagement in online learning. Most research has investigated which online learning activities promote the different dimensions of engagement (Schindler et al., 2017 ); this research includes recent articles on specific strategies for fostering student engagement, such as identifying pedagogical touchpoints (Tualaulelei et al., 2022 ) or using learning analytics and nudging (Brown et al., 2022 ). In a literature review on the impact of different forms of computer-based technology on student engagement in higher education, Schindler et al. ( 2017 ) found that digital games, followed by web-conferencing and the use of Facebook, had the most influential and positive impact across all three dimensions of student engagement.

There is much less research about why and how online learning activities promote student engagement. Bond et al. (2020) systematically examined the research on student engagement and various forms of online learning in higher education, and they noted the lack of studies on the mechanisms that facilitate engaging online learning activities, as well as the lack of qualitative research in this area. Only a few recent studies have addressed mechanisms that may explain how online education can foster student engagement (Martin & Borup, 2022; Muir et al., 2019; O’Shea et al., 2015). Two qualitative studies showed that students’ engagement with online learning activities may be influenced by factors such as communication, responsiveness, and course design (O’Shea et al., 2015), as well as teacher presence (Muir et al., 2019). Mechanisms derived from these factors for engaging students in online settings include maintaining good contact between students and lecturers, acknowledging the online status of learners, and providing a clear structure to online students (Muir et al., 2019; O’Shea et al., 2015). Recently, Martin and Borup (2022) synthesized from the literature five online environmental factors that promote learner engagement, which resemble the factors identified previously by O’Shea et al. (2015) and Muir et al. (2019): communication, interaction, presence, collaboration, and community. However, the latter was a conceptual study, so empirical literature on the mechanisms through which online learning activities stimulate the different dimensions of engagement remains nearly non-existent.

2.4 Aims of the Current Study

The current study aims to fill this gap in the research field by empirically exploring how student engagement is promoted in online education. In-depth qualitative research into students’ experiences is needed to understand what promotes student engagement in online learning settings and, more importantly, how it does this (Bond et al., 2020). To gain more insight into the mechanisms that underlie student engagement, the current study investigated the experiences that students at the University of Amsterdam had with online learning in 2020 and 2021, during the period when teaching at the university went online because of the COVID-19 pandemic. The following research question was investigated: What online learning activities did students experience that promoted their engagement in online education, and how did these activities promote their engagement? More specifically, this question was addressed through three sub-questions that correspond to the three major dimensions of engagement: What type of online learning activities promoted students’ (1) behavioral, (2) affective, and (3) cognitive engagement, and through which mechanisms?

3.1 Sample and Participants

To address the research questions, six online semi-structured focus groups were held on Zoom with students from different social sciences programs in the Faculty of Behavioral and Social Sciences (FMG: Faculteit Maatschappij & Gedrag) at the university. Participants were recruited through personal and university networks with the help of newsletters and faculty contacts. Therefore, the sampling method for this study can be regarded as convenience sampling. Initially, 29 students signed up for this study. Over the course of the study, there was one no-show, and three participants dropped out. This led to the sample for this study consisting of 25 Dutch-speaking students attending bachelor’s and master’s degree programs in the field of social sciences. Table 1 provides an overview of the composition of the focus groups, the participants’ background characteristics, and the FMG programs involved (other programs within this faculty are, e.g., sociology, psychology, and anthropology). Students from similar programs were assigned to the same focus group. Within the groups, there was variation in study level (i.e., bachelor’s or master’s degree students) and age (approximate range: 20–35 years). Most participants were female ( n  = 2 male participants). Recruitment criteria were as follows: (1) participants had to be a bachelor’s or master’s degree student at the university, (2) they had to be enrolled in a social sciences program in the FMG, (3) they had to speak Dutch, and (4) they had to have experience, as a student, with online education.

3.2 Procedure

This study was approved by the Ethics Review Board of the faculty. All participants took part in one semi-structured online focus group that lasted a maximum of 85 min (range: 77–85 min). The size of the focus groups ranged from three to five participants.

Participants received an e-mail that contained an information letter about the research project, a link to an online informed consent form, and a link to a brief background questionnaire in Qualtrics that asked about their study program, whether they were full-time or part-time students, their study year, and their prior experiences with online education. Participants were also requested to indicate their availability on an online schedule. Both one week and one day prior to the focus group, students received a reminder, including the link to the Zoom meeting and practical information. All focus groups took place online via Zoom, and audio and video were recorded with the consent of the participants. The personal data that was collected about individual participants was subsequently disconnected from what they said in the focus groups, to ensure that participants were less likely to be traceable based on their contributions.

To evaluate the focus group protocol, a pilot focus group was conducted with four students. This led to no major changes. The scripted procedure for the focus groups was as follows: After a walk-in of ten minutes and a short round of introductions, a warm-up activity asked participants to think of one keyword that described online education for them. After this warm-up, a more substantive discussion began: participants were briefly introduced to the Zoom whiteboard tool and then asked to write on it first positive and then negative experiences regarding their engagement. The question about positive experiences was “What activities stimulated your engagement during online learning activities?” and the question about negative experiences was “What activities diminished your engagement during online learning activities?” After participants used their answers to discuss their online experiences, they were asked about their preferences for future education, especially concerning online learning, by responding to the following short scenario:

Imagine that the Corona crisis is over and we are back to normal. Classes can take place on campus again, but due to Corona, we also have more experience with online education. According to you, what aspects of online education should or must still have a place in future education in view of promoting student engagement?

This activity was added to prompt further reflection on engaging online educational activities rather than to identify students’ wishes and preferences for the future. After they responded to the scenario, the session was wrapped up. No focus group deviated significantly from the protocol.

3.3 Data Analysis

All recordings were transcribed by the researcher, and pseudonyms were used to protect the privacy of participants. The pilot data were also included in the analyses. The transcribed materials were coded, first deductively and then inductively, and analyzed by the first author using the software program ATLAS.ti (Version 8). Table 2 provides an overview of the coding scheme and the different phases of coding that were performed. The first phase of deductive coding identified the engaging activities that the students mentioned (see columns 3 and 4 in Table 2 for these activities). In a second deductive phase, these activities were coded as contributing to one or more of the three dimensions of student engagement spelled out by Fredricks et al. (2004), and within these dimensions the activities were coded as either synchronous or asynchronous (in Table 2, the dimensions of engagement appear in column 1 and the synchronous and asynchronous activities appear in columns 3 and 4). The third, inductive phase of coding categorized the coded activities (within each dimension of engagement) into types of online learning activities by identifying the underlying mechanisms that made the activities engaging (see column 2 in Table 2). Throughout this phase, the second author was consulted continuously to ensure reliability.

4 Findings

This section presents the main findings of the study by discussing the different activities that promoted participants’ engagement in online learning. For each dimension of engagement (behavioral, affective, and cognitive), the relevant online learning activities are organized according to their engagement-promoting mechanisms (see Table 2 for an overview).

4.1 Behavioral Engagement

4.1.1 Activities That Promote Attention and Focus

Remaining behaviorally engaged in online education was challenging for some of the students in the study, as they found it hard to discipline themselves and to develop a daily work structure and rhythm without being physically present at the university. Nonetheless, several students in the focus groups identified various synchronous online learning activities that promoted their behavioral engagement by stimulating their attention and focus. In particular, several students mentioned that small interactive group assignments in breakout rooms kept them focused and attentive during lectures. Breakout rooms in general helped to stimulate them, as one participant (P4) indicated: “Those breakout rooms showed that you had been listening for a while and that you could do something with a small group. That can really activate you, at least me.” Another student also described how interactive activities kept them focused: “I am very easily distracted online and especially when something interactive is used, you are drawn back to it” (P1).

Informal activities were also mentioned by some students as helping to capture their attention at the start of a lecture, rather than opening with formalities or dry theory:

I once had a lecture that started with a picture with all kinds of animals and moods on it, asking us like which animal do you feel today. You know, a nice introduction that everyone can laugh about and then you are instantly a little more involved in the lecture than when you are immediately pulled into the theory. (P4)

Besides interactive and informal activities, clear communication of goals and structure was mentioned by several students as important in promoting their behavioral engagement: it made them aware of the structure of the course, which sustained their attention and focus. Multiple students also mentioned sufficient breaks as important for keeping their attention and focus. Lastly, the teacher asking questions, giving turns to students, and encouraging discussion all promoted behavioral engagement, according to most students in the study, because these activities kept students on their toes.

4.1.2 Activities That Stimulate Effort

Several online learning activities stimulated students to put effort into learning tasks. These activities included giving presentations or making a pitch after working on a group assignment in a breakout room. Time spent working on assignments online with peers also stimulated effort for several students. Sometimes the teacher was present in these Zoom sessions to answer questions, either during or after a lecture (“sticking around”) or at a separate time reserved by the teacher (“walk-in moments”). According to students, these Zoom sessions not only helped them put effort into working on assignments and completing them; they also made it easier to ask for help and to raise questions that could not be asked during lectures. Overall, online learning activities made the interviewed students more actively involved with their course work, which in turn affected their cognitive engagement with the course.

Several asynchronous activities also stimulated effort. For instance, watching pre-recorded videos and contributing to online discussion boards, both available on the Canvas learning management system (LMS), made students put more effort into participating both before class and during lectures. Some of the participating students also mentioned that compulsory preparatory assignments got them acquainted with the materials prior to lectures, engaging them beforehand and leading to greater engagement during lectures.

4.1.3 Activities That Break Barriers

Behavioral engagement was also promoted by activities that break barriers: i.e., activities that lower the barrier for students to speak, participate, and contribute to discussions. Online learning environments can inhibit some students from participating, but online polls, like Kahoot or Mentimeter, made it easier and more comfortable for students to start talking. As one student noted: “Those polls often give some kind of push to talk about things […]. That steppingstone to start talking and tell something, which for some might be just a bit too much to do in one go” (P5).

Some students noted that activities that generated discussion and conversation helped make them feel comfortable talking. One student described how this process worked: “If a discussion is really being provoked, the conversation just gets going a bit and then you have already heard each other's voices once, which makes it a little easier the next time to be able to start a discussion again” (P1). Several students also noted how the barrier to speaking was lower in smaller groups when they were required to unmute and turn their cameras on. This setting also engaged quieter or more withdrawn students; as one student described: “Sometimes, people are less likely to say something when you first have to unmute, which means the same people talk all the time” (P10). Other students also mentioned that peers having their cameras turned on helped them stay involved and engaged instead of searching for distractions or tuning out.

Lastly, teachers giving turns to students helped break barriers to their speaking, as this student explained: “I just think it works better if a teacher […] really gives turns or pays attention to who has already said a lot and who hasn't yet, because […] often there are students who really have something to say, but I think they are not comfortable to do so at that moment” (P17).

4.1.4 Activities That Provide Flexibility

Most of the students in the study saw the flexibility to choose where they followed lectures as an advantage of online education. By letting students attend lectures more easily and more often, this flexibility stimulated behavioral engagement. Meeting with teachers online, such as for thesis supervision, was also considered more convenient than meeting on campus. Several students also mentioned that teachers were more accessible through Zoom.

Asynchronous activities, such as pre-recorded videos and lectures, were mentioned by most students as giving them the chance to experience their education at their own time, pace, and location. For example, one student indicated how she preferred to listen to lectures online and in her own time: “I'm actually never present at a live lecture, because I just like getting my notes right the first time by being able to rewind a bit if I cannot hear it” (P25). Several students also noted that meeting with peers to work on course assignments became easier and more manageable online, especially in larger groups. Several students in the study found that pre-recorded lectures were useful for preparing for exams and gave them the freedom to follow the lecture at their own time and pace. Students in the focus groups especially appreciated recorded materials offered in addition to lectures, which could complement other study materials such as readings and contact moments. Depending on their personal preferences, students’ use of pre-recorded lectures had different effects on their behavioral engagement.

4.2 Affective Engagement

4.2.1 Activities That Promote a Group Feeling

Online learning activities promoted the affective engagement of many of the study’s participants by helping to create a group feeling, even though the activities were often not intended to have that effect. For example, having cameras on and being unmuted in small groups created less anonymity, according to some students, and gave them a sense of being together. Fun introductory and informal activities were also mentioned as stimulating affective engagement, much as they stimulated behavioral engagement. An example of such an activity is the “million-dollar question,” described by one of the interviewed students:

You had to design and briefly explain a research proposal that you had never investigated before, which was not really up your alley, but which you thought was very important to investigate. Then you introduce yourself based on that, on why you thought it was important, so you got to know each other more personally right away […]. This way, you also remembered fellow students much better, making it easier to talk to each other or send a message. (P10)

Similarly, several participants noted that online meetings that were organized to have students work together not only made students put effort into an assignment but also gave them a feeling of being part of a group. One student also considered it important to have the teacher check on the students: “A teacher who often asked, 'How was your weekend?' at the beginning of the lesson, and then he asked this to a number of people, which makes it seem as if you create some kind of bond within the group” (P21). This bonding was indicated by another student as an important goal for the teacher in online settings: “As long as things remain online, a teacher’s role should also be that of a connector” (P20).

Some asynchronous activities also promoted a group feeling that fostered affective engagement for some of the interviewed students. For a feeling of belonging to the course, several students valued WhatsApp groups that included both teachers and fellow students, while others thought that Canvas promoted affective engagement, provided that the teacher used it well: “It really depends on how the teacher uses it [Canvas], because the moment you take it seriously and a teacher also provides feedback in it […]. Then I think it's a nice tool, also when other people can respond to each other” (P6).

4.2.2 Activities That Encourage Interaction

According to the participants, affective engagement was also promoted by activities that encourage interaction. The chat function was mentioned as a means for students to reach out to the teacher and to each other, as pointed out by one student: “Through the chat, I have seen that people are brought together because someone asks a question” (P6). However, some students indicated that the chat or polls could also decrease interaction, especially when cameras are turned off:

If you work a lot with polls or via the chat, […] all questions are only asked in the chat at some point […]. To me, that feels like less interaction […]. When those cameras are off, from what I experienced, the chat is also used more often, making it [interaction] drop even further. (P7)

Some students mentioned that their feeling of being connected to peers was also boosted by using breakout rooms. Breakout rooms created moments to engage and interact with peers and to get to know each other. They also facilitated new contacts: participants were sometimes paired in breakout rooms with students they had not previously met, whereas on campus they would generally interact with students they already knew. An exemplary quote from one student illustrates this:

I started a minor program last semester and I ended up in a group of people I didn't know at all […] and I noticed that the online environment […] especially forced by those breakout rooms to talk to people, so you “have” to, you can't hide and withdraw yourself […]. So, for me it helped to make contact in a whole new environment. (P22)

Some students in the focus groups commented on how important it is to promote informal interaction in an online setting. Whereas informal interaction happens naturally in real-life settings, it must be encouraged in the online environment, as one student pointed out: “Now [during the COVID-19 pandemic], the attention is very much focused on ordinary informal contact with each other, and this was not the case before because it was self-evident” (P18). This informal contact is exactly what fueled affective engagement for some of the students in the study.

In terms of the teacher’s role, most interviewed students agreed that teachers should facilitate contact and interaction between students in an online environment. As an example of this, students mentioned mentor groups as encouraging open discussions among students and facilitating informal interaction. The students in the study were generally positive about mentor groups, as suggested by this comment: “A mentor group is really kind of a summary of all the coffee breaks between lectures, but then in one hour” (P7). This shows how mentor groups may fulfill a different function in online education than in regular on-campus education. Some students in our study who did not experience mentor groups in their classes expressed a desire for them.

4.2.3 Activities That Create a Sense of Empathy and Trust

Several students also considered activities that create a sense of empathy and trust as promoting their affective engagement. The fun and informal activities mentioned above stimulated an overall feeling of togetherness with the group and teachers. One student gave an example of how a teacher engaged students during the break:

During the break, she [the teacher] had a special PowerPoint slide reading: “count the number of people you see outside” […] or she had written down certain stretching exercises […]. Anyway, very thoughtful and compassionate to our needs, showing involvement or that they [the teacher] had really made an effort. (P13)

Another student described how a teacher created “little moments of happiness” (“geluksmomentjes”):

That [teacher] could really say, “don't touch the paper for a while, enjoy the snow” […] because of this, you also build a kind of personal bond with the teacher […]. Yes, very small simple things, making you see that the teacher is also human, so kind of the person behind the teacher. (P19)

These activities helped these students feel appreciated by teachers, leading them to feel more engaged in return. By asking them questions, teachers not only promoted behavioral engagement but also activated a sense of affective engagement in these students. It made them feel valued by the teacher and comfortable enough to participate and contribute to the meeting.

The contributions of some of the interviewed students indicate that empathy and trust refer to students’ feelings of understanding and being heard not only by teachers but also by the institution. Asynchronous online activities initiated by the institution or teachers promoted affective engagement, according to some students. Among these activities were supportive e-mails and clear communication about online exams and assignments. Apart from the communication itself, the attitude or tone that accompanied the communication from teachers and the institution was important for creating trust, according to some students. This supportive tone includes flexibility in deadlines and schedules and the absence of a feeling of hierarchy, as this student pointed out: “You are all on one level, instead of the hierarchy portraying that the teacher is 'there,' on that side of the room, and we are all here, on ‘this’ side of the room. I think this makes you understand each other better in terms of how you are all in this [COVID-19] situation together” (P24). Across the focus groups, several students appreciated teachers’ efforts and felt great sympathy from and towards them when teachers actively tried to empathize with students.

4.3 Cognitive Engagement

4.3.1 Activities That Generate Discussion

Several students said that their cognitive engagement was stimulated by activities that generate discussion. Many of those activities have already been discussed, as they were also mentioned as promoting behavioral engagement: online polls, which generated deeper discussion about the results; unmuting students in group work, which created more room for in-depth discussions since students were less inclined to stay silent; and breakout rooms, where students could talk to each other freely and unmuted in small groups.

Several online learning activities that promoted affective engagement also fostered cognitive engagement. For example, several students mentioned that informal moments such as coffee breaks or catching up at the start of a lecture could set the tone of the meeting and potentially stimulate more in-depth or high-quality conversations. As one student noted:

A teacher encouraging personal conversations […] made it much more relaxed to go into class and during coffee breaks, that you do not talk about your studies but can simply get to know each other. I do have the idea that afterwards, it was very conducive to the discussion during a lecture. (P18)

When teachers planned online feedback moments for assignments, several students felt more engaged not only with peers and the teacher but also with the assignment itself, pointing towards cognitive engagement. As this student described: “The more feedback moments you have, the more you are involved in an assignment […]. Because of that, you really have the idea that there is much more of a learning curve” (P17).

A teacher asking open questions, giving turns, and encouraging discussion promoted both behavioral and cognitive engagement for some students. Asking interesting questions provoked more discussion and input from students and got more students to interact, as this student pointed out:

I think it really depends on a lecturer knowing how to use the digital environment […]. I do not think it has to be the digital environment as such, but that it depends on a lot of factors, including a teacher who knows well how to engage everyone and who can make it interesting. (P22)

Several asynchronous activities could also generate discussion, according to some students in the study. Pre-recorded videos covering the basic theory made more room during lectures for discussion and for going into depth with the materials. Similarly, watching pre-recorded videos such as micro lectures could free up time for discussion during lectures, thereby promoting cognitive engagement. As one student described:

With those micro lectures, I really had the idea that the teacher recorded a video beforehand and then you felt much more confident about the subject that you will discuss afterwards. You can think about it for a while, and then you can really have a discussion of good quality. Then everyone also participates better. (P18)

4.3.2 Activities That Personalize

According to students’ contributions, several asynchronous activities stimulated students’ cognitive engagement through personalization before online lectures took place. For instance, pre-recorded videos enabled some students to better tailor their education to their own needs and preferences. These students noted that the videos were especially useful for students with little prior knowledge about a course and the theories taught, for students who sought further explanation, or for students who wished to immerse themselves in the study materials. The videos also encouraged deeper learning, as one student explained:

A knowledge clip [“kennisclip”] is posted online, this is already background, so you can just catch up with this yourself […]. And also for myself, because if there is something that I no longer recognize or do not know anymore, that you can just quickly watch a knowledge clip about something that covers the basics. (P6)

5 Discussion

In this study, we identified learning activities and their underlying mechanisms that promote behavioral, affective, and cognitive student engagement in online learning environments. Behavioral engagement was found to be enhanced through the mechanisms of promoting attention and focus, stimulating effort, breaking barriers, and providing flexibility. For most of the students, affective engagement was stimulated through the mechanisms of promoting a group feeling, encouraging interaction, and creating a sense of empathy and trust. Finally, for the students in our study, cognitive engagement was promoted through the mechanisms of generating discussion (in synchronous activities) and personalizing (in asynchronous activities). Several activities were identified that trigger these mechanisms.

The current study addressed several gaps identified in previous literature (Bond et al., 2020; Salas‐Pilco et al., 2022). First, it gave insight into the mechanisms that make online learning activities engaging rather than focusing on the engaging activities themselves. Previous studies have pointed out that not much is known about why and how online learning activities can foster student engagement (Bond et al., 2020). In particular, whereas mechanisms of affective engagement have received some attention (Martin & Borup, 2022; Muir et al., 2019; O’Shea et al., 2015), mechanisms of behavioral and cognitive engagement in online learning have until now remained largely unexplained. This focus on mechanisms is important, as it has been shown that how students experience learning activities can differ, for example as a result of context (Huang & Wang, 2023; Martin & Borup, 2022). A focus on underlying mechanisms sheds light on more general principles that may not depend as much on preference and context.

Second, we used qualitative data in the form of focus groups to investigate these underlying mechanisms, answering a call in the literature for qualitative studies of student engagement in online education (Bond et al., 2020 ). Our methodology has resulted in detailed insights into what makes online learning activities engaging to students, which might not have been uncovered with quantitative research.

Our findings are in line with previous research emphasizing that interaction and collaboration between students are particularly important for student engagement in online settings (Muir et al., 2019). Our research showed that students indeed appreciated activities that encouraged interaction and promoted a group feeling, such as fun introductory activities or an active discussion board on Canvas. Our findings also suggest that these activities do not occur as a matter of course, which concurs with previous findings that collaboration and interaction are hard to achieve in online education (Dumford & Miller, 2018; Meyer, 2014; Redmond et al., 2018). Our focus groups revealed mechanisms similar to those that Martin and Borup (2022) suggested based on their review of existing literature on online learner engagement. They proposed a framework that distinguishes communication, interaction, presence, collaboration, and community as mechanisms for promoting student engagement in online settings. The mechanisms that we found in our study, especially those promoting students’ affective engagement, such as activities that create a group feeling and interaction or activities that create a sense of empathy and trust, show considerable similarities with what Martin and Borup (2022) found in their review study.

Our study both confirms and challenges the value of distinguishing multiple dimensions of student engagement (Fredricks et al., 2004). On the one hand, in analyzing how learning activities affected engagement, we found considerable overlap in the activities that were considered to enhance the different dimensions of engagement. Online learning activities that stimulated behavioral or affective engagement sometimes also promoted cognitive engagement, for instance by tapping into a form of deeper learning that challenged students’ thinking. An example of this overlap can be found in the use of breakout rooms, which stimulated all three dimensions of engagement. On the other hand, we also found that the underlying mechanisms that make activities engaging for students are specific to one of the three dimensions. Working in breakout rooms stimulated behavioral engagement by supporting students’ attention and focus, fostered affective engagement by enabling interaction, and promoted cognitive engagement by generating discussions. Our findings are thus in line with the three-dimensional engagement model (Fredricks et al., 2004), even as they underline the interrelatedness of the dimensions that this model distinguishes.

It may be disputed whether the activities that we identified as stimulating engagement, along with the mechanisms that made those activities engaging, are specific to online education. However, several learning activities that were found to promote student engagement in our study are indeed distinctive for online education because they are only possible through the use of online tools, such as the breakout rooms that were reported to stimulate all three dimensions of engagement. Other engagement-enhancing activities can be considered specific to online education because the online setting requires extra effort compared to on-campus education. Some activities that were mentioned as promoting engagement and that seemed specific to the online setting were not necessarily related to learning. Institutional presence and support, for example, were important for students’ feelings of belonging to the institution. Thus, through the mechanism of a sense of empathy and trust, regular and clear communication were crucial factors for engaging students online and maintaining their distinctive status as online learners, as O’Shea et al. (2015) and Muir et al. (2019) also observed.

6 Conclusion

6.1 Major Findings

This study has shed light on online learning activities that teachers can use to enhance students’ behavioral, affective, and cognitive engagement, and has identified mechanisms that explain how these activities stimulate students’ engagement. Online activities that stimulated engagement took place both synchronously and asynchronously. Although there was overlap in the online learning activities that were found to enhance the different dimensions of engagement, the underlying mechanisms that make activities engaging for students appeared to be specific to one of the three dimensions, thus shedding light on general principles of promoting engagement that may be less dependent on preference and context.

6.2 Limitations

This study has several limitations. First, the study focused on students’ online learning experiences during the COVID-19 pandemic. These experiences likely differed from students’ experiences with online learning in regular times. The abrupt transition to online education due to the pandemic has been characterized as emergency remote teaching (Hodges et al., 2020; Tartavulea et al., 2020; Watermeyer et al., 2020), where students who had not chosen to study online suddenly had to do so. Students were also taught by teachers who mostly had not been trained to teach online. Apart from this sudden shift to online education, daily life also changed drastically because of the COVID-19 pandemic. Students’ engagement with learning was likely affected by more than just the fact that teaching moved online; students also had to deal, for instance, with the social isolation they experienced during the lockdowns. Addressing these additional factors, however, went beyond the scope of our research. Although they should be considered when drawing lessons from our study for “normal” online learning, we think that our focus on the underlying mechanisms that make online learning activities engaging (the how) has enabled us to suggest some general principles.

Another limitation of our study is that it focused on Dutch students, whereas international students may have had different experiences engaging with online education at the University of Amsterdam. Similarly, experiences at the university may differ at faculties outside of the social sciences. Nevertheless, we believe that our findings are not unique to Dutch social sciences students, as these students do not differ from other students in ways that are significant to our research question. We therefore think that educators and researchers outside the Netherlands can benefit from our study. Despite its small sample size and specific population, we think that our study sheds light on the mechanisms that are involved in engaging students in online education.

A final limitation of this study is that it was based on the traditional three-dimensional student engagement model. We did not focus on additional dimensions of student engagement in online learning that have been suggested in the literature, such as agentic engagement (Chiu, 2022) and social engagement (Redmond et al., 2018).

6.3 Implications

This study has implications for theory, future research, and educational practice. The results highlighted the interrelatedness of the dimensions of engagement, whereas previous research has often conceptualized them separately (Fredricks et al., 2004). This could imply a need for a paradigm shift in which connections between dimensions of engagement are sought more actively. Other dimensions of engagement, such as agentic engagement (Chiu, 2022) and social engagement (Redmond et al., 2018), could be considered as part of this construct. We recommend that future qualitative and quantitative research into online learning activities include these dimensions of engagement and consider their interrelatedness, since doing so could further clarify how the mechanisms behind online learning activities promote student engagement. We also encourage future researchers to further investigate the effectiveness of these and other types of online learning activities in promoting student engagement. We hope that our findings will provide inspiration for future research involving other populations of students, larger samples, and time periods that are not warped by a pandemic.

Whereas research at the start of the COVID-19 pandemic focused on how the transition to online learning affected student engagement (Stevens et al., 2020), recent studies have sought to improve future online and blended forms of education by examining which online education practices best promote student engagement (McKeithan et al., 2021). Teachers, who play a key role in stimulating student engagement in online classes, can benefit from insights into learning activities that promote student engagement in online settings, as well as into the mechanisms that underlie these activities (Demedts et al., 2015; Meij et al., 2021). Our study suggests how teachers may use certain digital tools and online learning activities to stimulate the different dimensions of student engagement. In addition, the mechanisms foregrounded in our study can help teachers choose those tools and activities by clarifying what students need in order to be engaged online.

Availability of Data and Materials

The data generated and analysed for this study are not publicly available for privacy reasons and to guarantee the anonymity of the participants.

References

Ali, I., Narayan, A. K., & Sharma, U. (2020). Adapting to COVID-19 disruptions: Student engagement in online learning of accounting. Accounting Research Journal . https://doi.org/10.1108/ARJ-09-2020-0293

Aparicio, G., Iturralde, T., & Maseda, A. (2021). A holistic bibliometric overview of the student engagement research field. Journal of Further and Higher Education, 45 (4), 540–557. https://doi.org/10.1080/0309877X.2020.1795092

Bond, M., Buntins, K., Bedenlier, S., Zawacki-Richter, O., & Kerres, M. (2020). Mapping research in student engagement and educational technology in higher education: A systematic evidence map. International Journal of Educational Technology in Higher Education, 17 (1), 1–30. https://doi.org/10.1186/s41239-019-0176-8

Brown, A., Lawrence, J., Basson, M., & Redmond, P. (2022). A conceptual framework to enhance student online learning and engagement in higher education. Higher Education Research & Development, 41 (2), 284–299. https://doi.org/10.1080/07294360.2020.1860912

Chiu, T. K. F. (2022). Applying the self-determination theory (SDT) to explain student engagement in online learning during the COVID-19 pandemic. Journal of Research on Technology in Education, 54 (sup1), S14–S30. https://doi.org/10.1080/15391523.2021.1891998

Christenson, S. L., Reschly, A. L., & Wylie, C. (2012). Handbook of research on student engagement . Springer Science & Business Media.

Demedts, L., Raes, F., Spittaels, O., Lust, G., & Van Puyenbroeck, H. (2015). De docent als sleutelfiguur bij blended learning. TheMa, 1 (1), 23–28.

Dumford, A. D., & Miller, A. L. (2018). Online learning in higher education: Exploring advantages and disadvantages for engagement. Journal of Computing in Higher Education, 30 (3), 452–465. https://doi.org/10.1007/s12528-018-9179-z

Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74 (1), 59–109. https://doi.org/10.3102/00346543074001059

Hodges, C., Moore, S., Lockee, B., Trust, T., & Bond, A. (2020). The difference between emergency remote teaching and online learning. Educause Review, 27 , 1–12.

Huang, Y., & Wang, S. (2023). How to motivate student engagement in emergency online learning? Evidence from the COVID-19 situation. Higher Education, 85 (5), 1101–1123. https://doi.org/10.1007/s10734-022-00880-2

Jensen, L. X., Karstad, O. M., Mosbech, A., Vermund, M. C., & Konradsen, F. (2020). Experiences and challenges of students during the 2020 campus lockdown: Results from student surveys at the University of Copenhagen . University of Copenhagen.

Martin, L. (2020). Foundations for good practice: The student experience of online learning in Australian higher education during the COVID-19 pandemic . Australian Government Tertiary Education Quality and Standards Agency (TEQSA). https://www.teqsa.gov.au/sites/default/files/student-experience-of-online-learning-in-australian-he-during-covid-19.pdf?v=1606442611

Martin, F., & Borup, J. (2022). Online learner engagement: Conceptual definitions, research themes, and supportive practices. Educational Psychologist, 57 (3), 162–177. https://doi.org/10.1080/00461520.2022.2089147

Martin, F., Sun, T., & Westine, C. D. (2020). A systematic review of research on online teaching and learning from 2009 to 2018. Computers & Education, 159 , 104009. https://doi.org/10.1016/j.compedu.2020.104009

McKeithan, G. K., Rivera, M. O., Mann, L. E., & Mann, L. B. (2021). Strategies to promote meaningful student engagement in online settings. Journal of Education and Training Studies, 9 (4), 1–11. https://doi.org/10.11114/jets.v9i4.5135

Means, B., & Neisler, J. (2020). Suddenly online: A national survey of undergraduates during the COVID-19 pandemic . Digital Promise. https://digitalpromise.dspacedirect.org/bitstream/handle/20.500.12265/98/DPSuddenlyOnlineReportJuly2020.pdf?sequence=3

Meeter, M., Bele, T., den Hartogh, C., Bakker, T., de Vries, R. E., & Plak, S. (2020). College students’ motivation and study results after COVID-19 stay-at-home orders . https://doi.org/10.31234/osf.io/kn6v9

Meij, M., Pareja Roblin, N., Van Dorresteijn, C., Voogt, J., Cornelissen, F., & Volman, M. (2021). Online Onderwijs op de UvA tijdens COVID-19: Didactische Strategieën om Sociale en Cognitieve Processen te Ondersteunen . Onderzoeksteam ‘Online onderwijs tijdens COVID-19’. Universiteit van Amsterdam.

Meyer, K. (2014). Student engagement in online learning: What works and why. Higher Education Report, 40 (6), 1–14. https://doi.org/10.1002/aehe.20018

Muir, T., Milthorpe, N., Stone, C., Dyment, J., Freeman, E., & Hopwood, B. (2019). Chronicling engagement: Students’ experience of online learning over time. Distance Education, 40 (2), 262–277. https://doi.org/10.1080/01587919.2019.1600367

Mulrooney, H. M., & Kelly, A. F. (2020). Covid 19 and the move to online teaching: Impact on perceptions of belonging in staff and students in a UK widening participation university. Journal of Applied Learning and Teaching, 3 (2), 1–14. https://doi.org/10.37074/jalt.2020.3.2.15

O’Shea, S., Stone, C., & Delahunty, J. (2015). “I ‘feel’ like I am at university even though I am online.” Exploring how students narrate their engagement with higher education institutions in an online learning environment. Distance Education, 36 (1), 41–58. https://doi.org/10.1080/01587919.2015.1019970

Redmond, P., Abawi, L. A., Brown, A., Henderson, R., & Heffernan, A. (2018). An online engagement framework for higher education. Online Learning, 22 (1), 183–204. https://doi.org/10.24059/olj.v22i1.1175

Roque-Hernández, R. V., Díaz-Roldán, J. L., López-Mendoza, A., & Salazar-Hernández, R. (2021). Instructor presence, interactive tools, student engagement, and satisfaction in online education during the COVID-19 Mexican lockdown. Interactive Learning Environments . https://doi.org/10.1080/10494820.2021.1912112

Salas-Pilco, S. Z., Yang, Y., & Zhang, Z. (2022). Student engagement in online learning in Latin American higher education during the COVID-19 pandemic: A systematic review. British Journal of Educational Technology, 53 (3), 593–619. https://doi.org/10.1111/bjet.13190

Schindler, L. A., Burkholder, G. J., Morad, O. A., & Marsh, C. (2017). Computer-based technology and student engagement: A critical review of the literature. International Journal of Educational Technology in Higher Education, 14 (1), 1–28. https://doi.org/10.1186/s41239-017-0063-0

Stevens, T., den Brok, P., Biemans, H., & Noroozi, O. (2020). The transition to online education: A case study of Wageningen University & Research. https://www.4tu.nl/cee/innovation/project/13042/the-transition-to-online-educationduring-the-corona-crisis-situation

Tartavulea, C. V., Albu, C. N., Albu, N., Dieaconescu, R. I., & Petre, S. (2020). Online teaching practices and the effectiveness of the educational process in the wake of the COVID-19 pandemic. Amfiteatru Economic, 22 (55), 920–936. https://doi.org/10.24818/EA/2020/55/920

Trowler, V. (2010). Student engagement literature review. The Higher Education Academy, 11 (1), 1–15.

Tualaulelei, E., Burke, K., Fanshawe, M., & Cameron, C. (2022). Mapping pedagogical touchpoints: Exploring online student engagement and course design. Active Learning in Higher Education, 23 (3), 189–203. https://doi.org/10.1177/1469787421990847

Walker, K. A., & Koralesky, K. E. (2021). Student and instructor perceptions of engagement after the rapid online transition of teaching due to COVID-19. Natural Sciences Education, 50 (1), e20038. https://doi.org/10.1002/nse2.20038

Watermeyer, R., Crick, T., Knight, C., & Goodall, J. (2020). COVID-19 and digital disruption in UK universities: Afflictions and affordances of emergency online migration. Higher Education, 81 , 623–641. https://doi.org/10.1007/s10734-020-00561-y

Wester, E. R., Walsh, L. L., Arango-Caro, S., & Callis-Duehl, K. L. (2021). Student engagement declines in STEM undergraduates during COVID-19—Driven remote learning. Journal of Microbiology & Biology Education, 22 (1), 1–11. https://doi.org/10.1128/jmbe.v22i1.2385

Wimpenny, K., & Savin-Baden, M. (2013). Alienation, agency and authenticity: A synthesis of the literature on student engagement. Teaching in Higher Education, 18 (3), 311–326. https://doi.org/10.1080/13562517.2012.725223

Acknowledgements

I would like to thank all the students who participated in the focus groups for this study, as well as the people who helped me recruit the participants. I would also like to specifically thank the research team ‘Online education during COVID-19’ for giving me the chance to be part of their team and for thinking along with me during the first stages of my research.

Funding

Not applicable.

Author information

Authors and affiliations

Research Institute of Child Development and Education, Educational Sciences, University of Amsterdam, Amsterdam, The Netherlands

Emma J. Vermeulen & Monique L. L. Volman

Contributions

E. J. Vermeulen wrote the full manuscript and collected and analysed the data. M. L. L. Volman supervised this process, provided multiple rounds of feedback, and made suggestions for writing the manuscript.

Corresponding author

Correspondence to Emma J. Vermeulen.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Focus group protocol (DOCX 35 KB)

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Vermeulen, E.J., Volman, M.L.L. Promoting Student Engagement in Online Education: Online Learning Experiences of Dutch University Students. Tech Know Learn (2024). https://doi.org/10.1007/s10758-023-09704-3

Accepted : 07 November 2023

Published : 27 February 2024

DOI : https://doi.org/10.1007/s10758-023-09704-3

Keywords

  • Student engagement
  • Online education
  • Dutch higher education
  • Online focus groups