A Meta-Analytic Study of Social Presence in Higher Education Online Environments
Dr. David Mykota
Volume 40, Issue 1, 2025 - ISSN: 2292-8588
https://doi.org/10.55667/ijede.2025.v40.i1.1351
Abstract: This study reports a systematic review and meta-analyses of the construct of social presence in online higher education settings. The research objectives are to: 1) determine the overall impact of scale-based measures of social presence on student learning outcomes, and 2) determine the overall impact of scale-based measures of social presence on student satisfaction outcomes. A thorough examination of the research literature from 1995 to 2022 was conducted, employing a three-stage screening process to identify 53 studies suitable for inclusion in the meta-analyses. Utilizing a random-effects model for analysis, the study investigated the two outcome measures with subgroup analysis. The results affirm that social presence has a moderate effect on both student satisfaction and learning outcomes, with no evidence of publication bias identified. A subgroup analysis, conducted to help explain some of the heterogeneity, found significant effects for mode of delivery and for the scale-based instrument used. The paper concludes by advocating for enhanced rigour in research design to facilitate empirically validated investigations into improving social presence in online learning environments.
Keywords: evidence synthesis, higher education, online learning, systematic review, meta-analysis, course design, teaching, technology, social presence
This work is licensed under a Creative Commons Attribution 3.0 Unported License.
Résumé : Cette étude présente une revue systématique et des méta-analyses portant sur le concept de présence sociale dans les environnements en ligne dans l'enseignement supérieur. Les objectifs de recherche sont les suivants : 1) déterminer l'impact global des mesures de la présence sociale, fondées sur des échelles, sur les résultats d'apprentissage des étudiants ; 2) évaluer l'impact global des mesures de la présence sociale, fondées sur des échelles, sur la satisfaction des étudiants. Une analyse rigoureuse de la littérature scientifique publiée entre 1995 et 2022 a été menée, selon un processus de sélection en trois étapes, permettant d'identifier 53 études pertinentes pour la méta-analyse. À l'aide d'un modèle à effets aléatoires, deux types de résultats ont été examinés et des analyses de sous-groupes ont été réalisées. Les résultats mettent en évidence que la présence sociale a un effet modéré sur la satisfaction et les résultats d'apprentissage des étudiants, sans preuve de biais de publication. La réalisation d'une analyse de sous-groupe a révélé des effets significatifs selon le mode de diffusion et l'instrument de mesure utilisé. L'article conclut en soulignant la nécessité d'une plus grande rigueur méthodologique pour favoriser la validation empirique des recherches et l'amélioration de la présence sociale dans les environnements d'apprentissage en ligne.
Mots-clés : synthèse de données, enseignement supérieur, apprentissage en ligne, revue systématique, méta-analyse, conception de cours, enseignement, technologie, présence sociale
In education, the Internet has facilitated the development of online learning, which has grown in popularity in post-secondary institutions because of its ability to provide a flexible and accessible learning environment for students. However, replicating the social interactions experienced in a face-to-face environment in an online environment can be a complex undertaking. To help define and describe these social and interpersonal interactions, the construct of social presence has been advanced. Understanding the effects of social presence is necessary because social presence has been found to influence student learning, retention, and satisfaction in online courses (Mykota, 2018). Given that social interactions in education have been shown to be a fundamental element of learning, it is important to understand how social presence can be facilitated in higher education online learning environments.
Research on social presence is both robust and varied. It has been found that social presence can be augmented and developed through purposeful online course design and instruction (Mykota, 2018). Although several individual studies have reported positive outcomes for students in their online learning experience because of social presence, the quantified pooled effects are not as well understood. This study is unique in that it systematically synthesizes the research to quantify the effects of social presence across studies, thereby helping to determine the relationship between social presence and student learning and satisfaction outcomes, and how these effects are moderated. The purpose of this study is to systematically review and statistically summarize, through meta-analyses (that is, the study of studies), the research literature on social presence and its effects on student outcomes within higher education.
Social presence, characterized by individual and group communication within an online learning environment, is fundamental to how we present ourselves and interact with others. The social and interpersonal interactions that characterize the way individuals communicate, interrelate, and project themselves online are often described as social presence. Given that education inherently involves social interactions (Dewey, 1963; Hiltz, 1994; Hurst et al., 2013; Liaw & Huang, 2000), understanding how to effectively facilitate social presence is crucial.
Early research in online learning, referred to as computer-mediated communication, found that social presence was a strong predictor of learner satisfaction (Gunawardena & Zittle, 1997). Richardson and Swan (2003) confirmed that a student's social presence was related to their perceived learning and that social presence was also a strong predictor of student satisfaction within a text-based online environment. Since this initial research, several other studies have acknowledged the positive relationship between social presence and learner satisfaction (Hostetter & Busch, 2006; Jaradat & Ajlouni, 2020; Richardson et al., 2017; Nasir, 2020; Zhan & Mei, 2013). A moderate positive relationship has also been reported between social presence and student learning and performance, as well as cognitive absorption (Tan, 2021a; Tan, 2021b).
Hostetter and Busch (2013) noted that students who demonstrated higher rates of social presence in their online discussion forums had statistically higher scores on standardized achievement tests. When the relationship between course design, social presence, and academic achievement was investigated, it was found that course designs facilitating meaningful interactions result in the development of social presence, which could positively impact academic performance (Joksimovic et al., 2015). Leong (2011), meanwhile, found that social presence directly influenced cognitive absorption, which in turn impacted satisfaction.
Accordingly, the community of inquiry (CoI) model placed emphasis on the interdependent relationship among teaching presence, social presence, and cognitive presence, which collectively foster meaningful and effective learning experiences in online and blended educational environments. In studying student perceptions, strong evidence was found to exist for the interconnectedness of the three presences (Garrison et al., 2010). Using structural equation modelling, Garrison et al. (2010) were able to tease out the relationship between the three presences, thereby demonstrating the importance of teaching presence as an influence on cognitive and social presence. It was also found that social presence predicted student perceptions of cognitive presence, confirming the mediating nature of social presence on cognitive and teaching presence. Using a mixed methods design, Ke (2010) noted the importance of teaching presence as a catalyst in the creation of a CoI and in ensuring the development of social and cognitive presence. Although not direct outcome measures of learning or knowledge gain as related to social presence, both studies were significant because they described the interrelatedness of the presences and the importance of teaching presence, and how social presence alone does not sustain or nurture critical inquiry (Bangert, 2008).
The impact of social presence is now recognized as being multi-faceted. This is because social presence can influence the effectiveness of online learning (Borup et al., 2012; Kim et al., 2011; Poth, 2018; Richardson & Swan, 2003), leading to higher rates of student satisfaction (Moallem, 2015; So & Brush, 2008), improved academic and learning performance (Hostetter & Busch, 2013; Joksimovic et al., 2015; Mykota, 2018; Richardson et al., 2017; Weidlich et al., 2023; Wise et al., 2004), and increased student retention rates (Bowers & Kumar, 2015; Robb & Sutton, 2014).
The theory of social presence has undergone significant evolution over the years, transitioning from a purely technologically determined environment in computer-mediated communication to one that is co-determined by both social and technological factors in digital learning spaces. At its origin, the concept of social presence was rooted in the sociological idea of co-presence, which Goffman (1959) described as the sensory awareness and contextual conditions that shape how individuals interact in face-to-face environments. The evolution of social presence theory in face-to-face communication was also influenced by social psychology, particularly in relation to concepts such as immediacy (Weiner & Mehrabian, 1968) and intimacy (Argyle & Dean, 1965). While immediacy described the psychological distance between communicators, intimacy reflected the closeness maintained through verbal and non-verbal cues (Rettie, 2003).
Short et al. (1976) further advanced the theory by applying social presence to the context of telecommunications. They defined social presence as the "degree of salience of another person in an interaction and the consequent salience of an interpersonal relationship" (p. 65), asserting that the social effects within computer-mediated communication environments stem from the level of social presence afforded to users. Building on this foundation, Gunawardena (1995) demonstrated that immediacy behaviours (actions that reduce psychological distance) serve to enhance and sustain social presence in online learning environments, thereby increasing the perception of participants as real individuals. This led to a reconceptualization of social presence, from a technologically determined phenomenon to one that is co-constructed through interpersonal and social interaction in educational settings (Gunawardena & Zittle, 1997; Tu & McIsaac, 2002).
Rourke et al. (1999), using qualitative content analysis of online course transcripts, examined how social presence manifests in online learning. They identified three key indicators: affective responses, interactive responses, and cohesive responses.
Building on this, Garrison et al. (2000) conceptualized social presence within the CoI framework as "the ability of participants in a community of inquiry to project themselves socially and emotionally, as 'real' people, through the medium of communication" (p. 94). Garrison (2009) later expanded this definition to emphasize the importance of identification with the community, purposeful communication, and relationship development within a trusting environment.
A distinctive aspect of Garrison's definition is its inclusion of the community element, positioning social presence as a function of group cohesion in educational settings. However, this community-focused perspective was not shared by all scholars. Kreijns et al. (2011), drawing on telepresence research by Lombard and Ditton (1997), defined social presence as "the degree of illusion that others appear to be a 'real' physical person in either an immediate (e.g., real-time/synchronous) or delayed (e.g., time-deferred/asynchronous) communication episode" (p. 366). Similarly, Biocca et al. (2003) and Kehrwald (2010) viewed social presence as existing along a continuum, varying in intensity, but notably excluded community or group dynamics in their definitions.
Kreijns et al. (2014) further critiqued the CoI model, arguing that it measured only aspects of social space (such as interpersonal salience) but neglected the psychological realism of individuals in online communication. As a result, they proposed a two-dimensional model of social presence to capture both interpersonal dynamics and individual realism.
Expanding on CoI research, Whiteside (2015) introduced the social presence model for online and blended learning environments. Grounded in Vygotsky's (1978) social development theory, this model departs from the traditional psychological perspectives (such as Short et al., 1976) by framing social presence as a critical literacy: a foundational competency for effective participation in digital learning environments. In this model, social presence is conceived not just as a psychological state, a set of behaviours, or a medium property, but as a comprehensive, overarching construct essential to meaningful engagement (Whiteside, 2017).
To enhance our understanding of the construct, Kreijns et al. (2022) identified social space and sociability as linked to social presence, albeit with distinct conceptualizations. Therefore, disentangling these terms and focusing solely on social presence as a unique psychological phenomenon and construct is essential for resolving the definitional issues that challenge social presence research (Kreijns et al., 2022). As the literature illustrates, defining social presence has become increasingly complex (Kreijns et al., 2014; Kreijns et al., 2022; Lowenthal, 2010). Whether it is seen as an internal state, a set of observable behaviours, a feature of communication technologies, or a critical digital literacy, social presence remains central to understanding effective communication in online learning contexts.
To aid in the development of social presence, a scoping study identified student learning and satisfaction as the more robust of the student outcome variables reported in the research literature (Mykota, 2018). However, these findings came from independent individual studies, with effect sizes varying across studies and the combined effects never statistically summarized. To date, the only attempt to systematically synthesize and quantify the research literature specifically on social presence was a meta-analysis limited to studies published up to May 2015 (Richardson et al., 2017). Although it provided a measure of the effect social presence has on perceived student learning and student satisfaction, the number of studies included was small, limiting the power of the study and the ability to investigate the heterogeneity of variance in effect sizes. Since social presence research across a variety of online learning environments continues to grow (Kreijns et al., 2022), the capacity to generate more stable and possibly convergent results for student learning and satisfaction outcomes of social presence is enhanced.
Systematic reviews and meta-analyses are becoming increasingly popular within the social sciences. This is particularly true in the fields of education and eLearning, in part because of the methodological rigour associated with these methods (Caskurlu et al., 2020; Chapman, 2021; Richardson et al., 2017; Zawacki-Richter et al., 2020). As such, a contemporary systematic review and meta-analysis on student learning and satisfaction outcomes for social presence is desirable. In turn, the study designs extracted can be replicated and previous results either confirmed or refuted. This systematic review and its meta-analyses allow for greater predictability in understanding the combined effects across studies that social presence has on student outcomes, as facilitated through a current review of the literature. Therefore, the focused purpose of this study is to report the findings of the systematic review and meta-analyses on social presence and its association with both student learning and student satisfaction outcomes. The overarching questions to be answered are: 1) What is the overall impact of scale-based measures of social presence on student learning outcomes in higher education online environments? and 2) What is the overall impact of scale-based measures of social presence on student satisfaction outcomes in higher education online environments?
A systematic review of the literature was undertaken to obtain an appropriate sample of studies from the research for the student learning and student satisfaction meta-analyses, that is, the statistical summary of the studies' results. The sample for the synthesis was derived from those studies that reported effect sizes for the relationship between social presence and student learning and/or student satisfaction outcomes. For the systematic review, because few studies report actual academic performance as a measure of learning, the term was broadened to include perceived learning, which is the degree to which individuals believe knowledge has been acquired (Richardson et al., 2017; Weidlich et al., 2023). Student satisfaction, then, is the degree to which students are satisfied with their online learning experience.
The study occurred in two phases. The first phase involved a systematic review of the literature which included: identifying relevant studies, screening and selecting studies, and undertaking the data extraction. The second phase comprised the meta-analyses for student learning and student satisfaction which included: entering the data, conducting the meta-analyses for student learning and student satisfaction, undertaking a subgroup analysis, and determining if publication bias exists.
Working in conjunction with research librarians and using the research questions outlined, a preliminary list of key concepts was constructed. A series of search terms representative of social presence was created to facilitate searching of the ERIC/OVID, APA PsycINFO, ProQuest Education, and Web of Science Core Collection databases. Using the list of key concepts, each broad term was mapped to terms in the databases to discover specific synonyms to search. In building the search terms, key concepts were combined using the Boolean operators "and" and "or", so that relevant articles from 1996 to 2022 were extracted and imported into EndNote. Additional literature was collected through hand-searching references, cross-checking references used in previous studies, conducting a complementary search of top authors, and finally conducting a Google Scholar search of key terms.
Inclusion criteria identified through the systematic review for the meta-analyses were based on the relevancy of the material to social presence and the identified outcomes of student learning and student satisfaction. The studies extracted for inclusion were grounded in a social presence theoretical framework and reported appropriate scale-based measurement of the variables to enable calculation of the correlation coefficient between social presence and the identified outcomes.
The methods for the systematic review were modelled after The Campbell Collaboration (2020) guidelines for conducting systematic reviews, the PRISMA (2025) Preferred Reporting Items for Systematic Reviews and Meta-Analyses checklist for reporting, and the Moher et al. (2009) data collection flow diagram for systematic reviews and meta-analysis. A three-stage screening process was developed for the systematic review to determine the adequacy of studies that might fit the meta-analyses (see Figure 1). In the first stage, two reviewers conducted an initial title, abstract, and keyword screen for student outcome (learning and satisfaction) based social presence studies. Discrepancies were resolved either through consensus or, if needed, involvement of a third reviewer.
The second stage of the screening process involved material previously identified as uncertain. To ascertain if materials identified as uncertain from the initial title and abstract screen were suitable for detailed review, reviewers read the full text and reached a final consensus on relevancy. The reference lists of those studies reviewed in detail were then searched for additional studies to be reviewed that had not been found in the database search.
The third stage involved identifying those articles reviewed in detail in the first two stages that were suitable for inclusion in the meta-analyses. Inclusion criteria for the meta-analyses were based on the following conditions: the study was grounded in a social presence theoretical framework; social presence was measured with a scale-based instrument; the study reported statistics sufficient to calculate a correlation coefficient between social presence and student learning and/or student satisfaction outcomes; and the study was conducted in a higher education online or blended learning environment.
For the studies screened for inclusion through the above-mentioned process, a full-text read occurred and a data extraction form was completed. The data extraction form summarized the study features including: author, year, publication type, location of study, student type, course length, discipline area, type of learning environment (such as asynchronous online, synchronous online, blended, or massive open online courses referred to as MOOC), social presence scale used, sample size, method of data analysis, and potential bias reported. A categorical coding scheme was then developed for the subgroup analysis.
Figure 1. Flow Diagram for the Systematic Review Screening
The data extraction form was piloted by the research team and assessed for completeness and ease of use. The percentage agreement between reviewers involved in the assessment was targeted at >90%. Based on the pilot testing, minor modifications to the data extraction form were undertaken to ensure that the data necessary for addressing the research questions were obtained. All data collected in the data extraction process was then housed in a secure data storage environment maintained by the University of Saskatchewan, which was accessible only to members of the research team.
Social presence studies tend to use rating scales as measures of social presence (Biocca et al., 2003; Chen et al., 2015; Mykota, 2018). Therefore, for the meta-analyses, correlation coefficients were used to calculate the effect sizes, and each correlation coefficient was transformed to a Fisher's Z score to normalize the distribution (Fisher, 1915; Hedges & Olkin, 1985). Two separate meta-analyses occurred: one for student learning outcomes and the other for student satisfaction outcomes. If a study reported both outcomes, the outcomes were treated as independent and entered separately in each analysis. Two manuscripts in the sample (Kang et al., 2014, i, ii, iii; Teng, 2005, i, ii) included more than one study for the two student outcomes and were noted as separate studies in the summary table. See Table 1 for the summary of studies included in the meta-analyses.
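As an illustration of this transformation step, the sketch below shows how a correlation r is converted to Fisher's Z and back, and how the sampling variance of Z depends only on sample size. This is a minimal illustration under standard formulas, not the study's analysis code; the actual analysis was performed in Comprehensive Meta-Analysis.

```python
import math

def fisher_z(r: float) -> float:
    """Variance-stabilizing transform of a correlation coefficient r."""
    return 0.5 * math.log((1 + r) / (1 - r))  # equivalent to math.atanh(r)

def inverse_fisher_z(z: float) -> float:
    """Back-transform a Fisher's Z score to a correlation coefficient."""
    return math.tanh(z)

def z_variance(n: int) -> float:
    """Sampling variance of Fisher's Z depends only on sample size n."""
    return 1.0 / (n - 3)

# Hypothetical example: a correlation of .50 from a study of n = 103
z = fisher_z(0.50)   # ≈ 0.549
v = z_variance(103)  # = 0.01
```

Working in Z units keeps the sampling distribution approximately normal across the range of correlations, which is why pooling is done on Z and results can be back-transformed to r for interpretation.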
A random-effects model was chosen as the statistical model for analysis of the two outcome measures because this model allows studies to have their own population effect sizes while acknowledging that, as in these meta-analyses, the studies included are treated as a sample of all possible studies (Borenstein et al., 2010; Borenstein et al., 2021; Cheung et al., 2012). This takes into consideration that other studies might not be identified using the search criteria adopted for inclusion in the meta-analysis (Borenstein et al., 2010; Borenstein et al., 2021; Cheung et al., 2012). In contrast, a fixed-effect model, which was not used, would assume that the studies included in the meta-analysis share a common effect size.
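Under a random-effects model, each study's weight incorporates both its sampling variance and the between-studies variance τ², so no single large study dominates the pooled estimate as it would under a fixed-effect model. A minimal sketch of this pooling, using hypothetical Fisher's Z values rather than the study's data:

```python
import math

def random_effects_mean(effects, variances, tau2):
    """Inverse-variance weighted mean under a random-effects model.
    Each study's weight is 1/(v_i + tau^2); adding the between-study
    variance tau^2 flattens the weights relative to a fixed-effect model."""
    weights = [1.0 / (v + tau2) for v in variances]
    mean = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return mean, se

# Hypothetical Fisher's Z effects and sampling variances for three studies
mean, se = random_effects_mean([0.55, 0.40, 0.62], [0.010, 0.020, 0.015],
                               tau2=0.05)
ci_95 = (mean - 1.96 * se, mean + 1.96 * se)
```

Setting tau2 to zero recovers the fixed-effect weights, which makes the contrast between the two models easy to see.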
| Study | Sample | Student Outcome |
|---|---|---|
| Alsadoon (2018) | 73 | Satisfaction |
| Arbaugh (2013) | 543 | Learning/Satisfaction |
| Arbaugh (2014) | 634 | Learning/Satisfaction |
| Barbera et al. (2013) | 499 | Learning/Satisfaction |
| Cakiroglu (2019) | 72 | Learning |
| Chen et al. (2020) | 659 | Learning/Satisfaction |
| Chen et al. (2018) | 166 | Satisfaction |
| Cho et al. (2010) | 100 | Learning |
| Cobb (2011) | 128 | Learning/Satisfaction |
| Costley (2019) | 433 | Learning |
| Daigle & Stuvland (2021) | 144 | Learning |
| Dang et al. (2019) | 699 | Learning |
| Daspit & D'Souza (2012) | 203 | Learning |
| Garrison et al. (2010) | 205 | Learning |
| Horzum (2015) | 277 | Learning/Satisfaction |
| Hostetter & Busch (2006) | 112 | Satisfaction |
| Jaradat & Ajlouni (2020) | 435 | Satisfaction |
| Johnson et al. (2008) | 345 | Learning/Satisfaction |
| Johnson et al. (2009) | 914 | Learning/Satisfaction |
| Joo et al. (2011) | 709 | Learning/Satisfaction |
| Kang et al. (2014i) | 68 | Learning/Satisfaction |
| Kang et al. (2014ii) | 63 | Learning/Satisfaction |
| Kang et al. (2014iii) | 47 | Learning/Satisfaction |
| Kim (2011) | 221 | Learning/Satisfaction |
| Kim et al. (2011) | 81 | Satisfaction |
| Kovanovic et al. (2018) | 1487 | Learning |
| Kozuh et al. (2015) | 62 | Learning |
| Ku et al. (2012) | 201 | Satisfaction |
| Kucuk & Richardson (2019) | 115 | Learning/Satisfaction |
| Kyei-Blankson et al. (2019) | 144 | Learning |
| Law et al. (2019) | 207 | Learning |
| Nasir (2020) | 73 | Satisfaction |
| Natarajan & Joseph (2022) | 177 | Satisfaction |
| Newberry (2003) | 94 | Satisfaction |
| Nyachae (2011) | 81 | Learning/Satisfaction |
| Pillai & Sivathanu (2020) | 1390 | Satisfaction |
| Reio & Crim (2013) | 280 | Satisfaction |
| Saadatmand et al. (2017) | 30 | Learning |
| Shea & Bidjerano (2009) | 5024 | Learning |
| So & Brush (2008) | 48 | Satisfaction |
| Song et al. (2019) | 262 | Learning/Satisfaction |
| Sorden & Munene (2013) | 98 | Learning/Satisfaction |
| Spears (2012) | 152 | Satisfaction |
| Swan & Shih (2019) | 51 | Learning |
| Tan (2021a) | 282 | Learning |
| Tan (2021b) | 456 | Learning |
| Teng (2005i) | 46 | Learning/Satisfaction |
| Teng (2005ii) | 31 | Learning/Satisfaction |
| Turel (2016) | 81 | Learning |
| Weidlich & Bastiaens (2017) | 162 | Learning/Satisfaction |
| Yildrim & Seferogulu (2021) | 1178 | Learning |
| Yin & Yaun (2021) | 256 | Learning |
| Zhan & Mei (2013) | 136 | Learning/Satisfaction |
Since a random-effects model for analysis was proposed, a test of power was conducted to determine whether the number of studies included is sufficient for statistical power in testing the effect size and average effect sizes as a function of heterogeneity (Borenstein et al., 2009; Borenstein et al., 2021; Cheung & Vijayakumar, 2016; Valentine et al., 2010).
Heterogeneity of effect sizes was determined using the Q test, the I² statistic, and the τ² (tau-squared) statistic (Cochran, 1954). The Q test for heterogeneity is a test of the null hypothesis that the studies in the analysis share a common effect size (Cochran, 1954). If all studies shared the same true effect size, the expected value of Q would equal the degrees of freedom (the number of studies minus one). The between-studies variance τ² and the I² statistic were also determined to characterize true heterogeneity. Using these three tests together allows a more accurate interpretation of the heterogeneity of effect sizes in the meta-analysis sample (Higgins & Thompson, 2002). Forest plots are reported to aid in understanding the effect sizes across studies and in reporting the mean effect size and confidence interval, with the prediction interval calculated to determine the degree to which the true effects vary (Borenstein, 2019).
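The three heterogeneity statistics can be sketched as follows. This is a simplified DerSimonian–Laird implementation with hypothetical inputs, not the study's analysis code:

```python
def heterogeneity(effects, variances):
    """Cochran's Q, I^2 (as a percentage), and the DerSimonian–Laird
    estimate of tau^2 (the between-study variance)."""
    k = len(effects)
    w = [1.0 / v for v in variances]  # fixed-effect weights
    fixed_mean = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - fixed_mean) ** 2 for wi, y in zip(w, effects))
    df = k - 1
    # I^2: share of observed variance reflecting true heterogeneity
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    # DerSimonian–Laird tau^2, truncated at zero
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    return q, i2, tau2

# Hypothetical Fisher's Z effects with equal sampling variances
q, i2, tau2 = heterogeneity([0.2, 0.5, 0.8], [0.01, 0.01, 0.01])
```

When Q is close to its degrees of freedom, I² approaches zero and τ² is truncated at zero, which is the no-heterogeneity case described above.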
As it is likely that only studies with significant results are reported and published in the research literature, it is important, especially when conducting and interpreting a meta-analysis, to determine whether publication bias exists. Several methods exist for this purpose. For the meta-analyses in this study, funnel plots and Egger's test of the intercept, along with trim and fill procedures, were used to ascertain the impact publication bias had on the meta-analyses' findings (Rothstein et al., 2006). The software chosen for the analysis was Comprehensive Meta-Analysis (Version 4).
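Egger's test regresses each study's standardized effect on its precision; an intercept far from zero suggests funnel-plot asymmetry consistent with publication bias. A simplified sketch with hypothetical inputs (the full procedure, as run in Comprehensive Meta-Analysis, also reports a t test on the intercept, which is omitted here):

```python
import math

def eggers_test(effects, variances):
    """Egger's regression: standardized effect (effect/SE) on precision (1/SE).
    Returns the regression intercept and slope; a non-zero intercept
    suggests funnel-plot asymmetry."""
    se = [math.sqrt(v) for v in variances]
    x = [1.0 / s for s in se]                  # precision
    y = [e / s for e, s in zip(effects, se)]   # standardized effect
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept, slope
```

In a perfectly symmetric funnel, the standardized effect is proportional to precision, so the intercept is zero and the slope estimates the common effect.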
The learning meta-analysis results are based on 40 studies (see Figure 2). The mean effect size is 0.531, with a 95% confidence interval of 0.438 to 0.612. The Q-value is 12221.335 with 39 degrees of freedom and p < 0.001, rejecting the null hypothesis that the true effect size is the same for all studies. The I² statistic indicates that 98% of the variance in observed effects can be attributed to variance in true effects rather than sampling error.
Figure 2. Learning Meta-Analysis Forest Plot
The tau-squared (τ²) statistic, representing the variance of true effect sizes, was estimated at 0.145 in Fisher's Z units. The standard deviation of true effect sizes, tau (τ), was 0.381 in Fisher's Z units. Assuming a normal distribution of true effects (in Fisher's Z units), the estimated prediction interval ranges from -0.187 to 0.879, encompassing 95% of comparable studies.
The satisfaction meta-analysis results are based on 34 studies (see Figure 3). The mean effect size is 0.503, with a 95% confidence interval of 0.427 to 0.572. The Q-value is 669.384 with 33 degrees of freedom and p < 0.001, rejecting the null hypothesis that the true effect size is the same for all studies. The I² statistic indicates that 95% of the variance in observed effects can be attributed to variance in true effects rather than sampling error. The tau-squared (τ²) statistic, representing the variance of true effect sizes, was estimated at 0.075 in Fisher's Z units, and the standard deviation of true effect sizes, tau (τ), was 0.273 in Fisher's Z units. Assuming a normal distribution of true effects (in Fisher's Z units), the estimated prediction interval ranges from -0.012 to 0.807, encompassing 95% of comparable studies.
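The prediction intervals reported for both meta-analyses follow from the pooled mean, its standard error, and τ². A sketch of the calculation in Fisher's Z units, with illustrative values only; note that some texts use a t critical value with k − 2 degrees of freedom rather than 1.96:

```python
import math

def prediction_interval(mean_z, se_z, tau2, z_crit=1.96):
    """Approximate 95% prediction interval for the true effect in a new
    comparable study, in Fisher's Z units: mean +/- z * sqrt(tau^2 + SE^2).
    Unlike the confidence interval, it widens with tau^2."""
    half = z_crit * math.sqrt(tau2 + se_z ** 2)
    return mean_z - half, mean_z + half

# Illustrative values (not the study's estimates)
lo, hi = prediction_interval(mean_z=0.5, se_z=0.05, tau2=0.09)
```

Because the interval depends on τ² rather than only on the standard error, a precise pooled mean can still yield a wide prediction interval when heterogeneity is large, which is exactly the pattern reported above.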
When using a random-effects model for analysis, power is influenced not only by the total sample size but also by the number of studies (Borenstein et al., 2021; Cheung & Vijayakumar, 2016; Valentine et al., 2010). Given the heterogeneity reported for both the learning and the satisfaction meta-analyses, the power tables as a function of the number of studies and heterogeneity provided by Borenstein et al. (2021) were consulted; power was determined to be .99 for the learning meta-analysis and .95 for the satisfaction meta-analysis.
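The tabled power values can be approximated analytically in the style of Hedges and Pigott, treating the variance of the pooled mean as roughly (mean within-study variance + τ²) / k. The sketch below uses hypothetical inputs, not the values consulted in the study:

```python
import math

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def power_random_effects(mu, mean_v, tau2, k, alpha_z=1.96):
    """Approximate two-sided power for testing the pooled mean effect mu
    under a random-effects model with k studies: the pooled mean's
    variance is taken as (mean_v + tau2) / k."""
    se = math.sqrt((mean_v + tau2) / k)
    lam = mu / se
    return 1 - phi(alpha_z - lam) + phi(-alpha_z - lam)

# Hypothetical: moderate effect, 40 studies, substantial heterogeneity
p = power_random_effects(mu=0.5, mean_v=0.02, tau2=0.05, k=40)
```

The formula makes visible why power in a random-effects meta-analysis depends on the number of studies k: τ² enters the pooled variance directly, and only adding studies (not enlarging individual samples) shrinks its contribution.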
Figure 3. Satisfaction Meta-Analysis Forest Plot
A subgroup analysis was undertaken for both student learning and satisfaction outcomes. Its purpose was to determine whether differences in true means between subgroups, acting as moderators, might explain some of the heterogeneity reported. This type of analysis is described as a mixed-effects analysis because the subgroups are fixed while the studies included are viewed as a random sample of applicable studies (Borenstein et al., 2021). To facilitate the analysis, variables for the subgroups were obtained from the data extraction forms and coded into categorical variables.
For both the learning and satisfaction meta-analyses, the subgroup categorical variables included: mode of delivery, social presence instrument used, course length, location of study, student type, subject discipline, and date of publication.
When a study used in the meta-analysis for learning and/or satisfaction reported combined results for both undergraduate and graduate students, a subgroup specific to undergraduate/graduate students was created. Similarly, when a study reported combined results for a variety of subject disciplines, a multi-disciplinary subject type was created. The subgroups for date of publication were grouped into three-year increments, except for the first group, due to the low number of earlier studies. For the instrument subgroup, when only one study cited a given instrument or the instrument was researcher designed, a separate researcher designed (RD) category was created. Subgroup analysis for both the learning and satisfaction meta-analyses revealed non-significant results for course length, location, and student type. Significant subgroup analysis results were found for the mode of delivery and instrument subgroups.
The results of the instrument subgroup analysis for the student learning and satisfaction meta-analyses are displayed in Table 2 and Table 3. What can be observed from the instrument subgroup analysis for the learning meta-analysis is that the effect of social presence is generally moderate across instruments, with point estimates ranging from .422 to .695, except for studies that used the SPSa (Kang et al., 2007), with a point estimate of .212, and the Short et al. (1976) scale, with a point estimate of .136 (see Table 2). In the instrument subgroup analysis for the satisfaction meta-analysis, the mean effect of social presence is also moderate across instruments, with point estimates ranging from .395 to .683, except for studies that used the CMCQ (Tu, 2002), where the point estimate was .091 (see Table 3).
The results of the mode of delivery subgroup analysis for the student learning and student satisfaction meta-analyses are displayed in Table 4 and Table 5, respectively. In the learning meta-analysis, the effect of social presence is moderate, with point estimates for the mode of delivery subgroups ranging from .502 to .649, except for the MOOC mode of delivery, which had a point estimate of .220 (see Table 4). It should be noted that only one MOOC case was reported for the learning meta-analysis. When that case is removed from the analysis, the point estimates for the remaining mode of delivery categories are unchanged and the subgroup results are not significant. The MOOC category was nonetheless reported to illustrate the difference in its point estimate. As well, the point estimate for the synchronous mode of delivery was higher than those for the other modes of delivery. In the mode of delivery subgroup analysis for the satisfaction meta-analysis, the mean effect of social presence is also moderate, with point estimates ranging from .520 to .587, except for the MOOC mode of delivery, which had a point estimate of .093 (see Table 5). Further, as in the learning subgroup analysis, the point estimate for the synchronous mode of delivery was slightly higher, at .587, although still within the moderate range.
Table 2. Mixed Effects Analysis of Instrument Subgroups for the Learning Meta-Analysis

| Instrument¹ | Number of Studies | Point Estimate | Z Value | P Value | Tau | TauSq | Q Value | df Q | P Value |
|---|---|---|---|---|---|---|---|---|---|
| SPI | 2 | 0.422 | 2.063 | 0.039 | 0.294 | 0.086 | - | - | - |
| CoI | 19 | 0.604 | 7.776 | 0.000 | 0.384 | 0.148 | - | - | - |
| SPSa | 3 | 0.212 | 2.797 | 0.005 | 0.000 | 0.000 | - | - | - |
| RD | 11 | 0.478 | 6.822 | 0.000 | 0.229 | 0.053 | - | - | - |
| Short | 2 | 0.136 | 4.835 | 0.000 | 0.000 | 0.000 | - | - | - |
| SPRES | 3 | 0.695 | 9.150 | 0.000 | 0.131 | 0.017 | - | - | - |
| Total Between | - | - | - | - | - | - | 96.28 | 5 | 0.000* |
| Overall | 40 | 0.259 | 11.398 | 0.000 | 0.381 | 0.145 | - | - | - |
Learning Mean Effect Size .531; *p < .001. ¹ SPI, Social Presence Inventory, Biocca et al., 2001; CoI, Community of Inquiry survey, Arbaugh et al., 2008; SPSa, Social Presence Scale, Kang et al., 2007; RD, researcher designed; Short, Short et al., 1976; SPRES, Social Presence Scale, Gunawardena & Zittle, 1997.
Table 3. Mixed Effects Analysis of Instrument Subgroups for the Satisfaction Meta-Analysis

| Instrument¹ | Number of Studies | Point Estimate | Z Value | P Value | Tau | TauSq | Q Value | df Q | P Value |
|---|---|---|---|---|---|---|---|---|---|
| SPI | 3 | 0.395 | 3.438 | 0.001 | 0.193 | 0.037 | - | - | - |
| CMCQ | 2 | 0.091 | 1.257 | 0.209 | 0.025 | 0.001 | - | - | - |
| CoI | 7 | 0.479 | 5.589 | 0.000 | 0.236 | 0.056 | - | - | - |
| SPSa | 3 | 0.439 | 3.786 | 0.000 | 0.169 | 0.028 | - | - | - |
| SPSb | 2 | 0.522 | 4.488 | 0.000 | 0.158 | 0.025 | - | - | - |
| RD | 9 | 0.487 | 4.281 | 0.000 | 0.354 | 0.125 | - | - | - |
| Short | 2 | 0.523 | 14.970 | 0.000 | 0.035 | 0.001 | - | - | - |
| SPRES | 6 | 0.683 | 17.467 | 0.000 | 0.087 | 0.008 | - | - | - |
| Total Between | - | - | - | - | - | - | 77.091 | 7 | 0.000* |
| Overall | 34 | 0.519 | 11.398 | 0.000 | 0.273 | 0.075 | - | - | - |
Satisfaction Mean Effect Size .503; *p < .001. ¹ SPI, Social Presence Inventory, Biocca et al., 2001; CMCQ, CMC Questionnaire, Tu, 2002; CoI, Community of Inquiry survey, Arbaugh et al., 2008; SPSa, Social Presence Scale, Kang et al., 2007; SPSb, Social Presence Scale, Kim, 2011; RD, researcher designed; Short, Short et al., 1976; SPRES, Social Presence Scale, Gunawardena & Zittle, 1997.
Table 4. Mixed Effects Analysis of Mode of Delivery Subgroups for the Learning Meta-Analysis

| Mode of Delivery | Number of Studies | Point Estimate | Z Value | P Value | Tau | TauSq | Q Value | df Q | P Value |
|---|---|---|---|---|---|---|---|---|---|
| Asynch | 26 | 0.502 | 9.470 | 0.000 | 0.282 | 0.080 | - | - | - |
| Blended | 7 | 0.553 | 3.353 | 0.001 | 0.481 | 0.231 | - | - | - |
| MOOC¹ | 1 | 0.220 | 5.728 | 0.000 | 0.000 | 0.000 | - | - | - |
| Synch | 6 | 0.649 | 4.161 | 0.000 | 0.446 | 0.119 | - | - | - |
| Total Between | - | - | - | - | - | - | 29.774 | 3 | 0.000* |
| Overall | 40 | 0.333 | 11.012 | 0.000 | 0.381 | 0.145 | - | - | - |
Learning Mean Effect Size .531; *p < .001. ¹ When MOOC is removed from the subgroup analysis, the non-significant p value is .505.
Table 5. Mixed Effects Analysis of Mode of Delivery Subgroups for the Satisfaction Meta-Analysis

| Mode of Delivery | Number of Studies | Point Estimate | Z Value | P Value | Tau | TauSq | Q Value | df Q | P Value |
|---|---|---|---|---|---|---|---|---|---|
| Asynch | 25 | 0.520 | 12.712 | 0.000 | 0.207 | 0.043 | - | - | - |
| Blended | 3 | 0.509 | 4.508 | 0.000 | 0.182 | 0.033 | - | - | - |
| MOOC | 2 | 0.093 | 4.225 | 0.000 | 0.000 | 0.000 | - | - | - |
| Synch | 4 | 0.587 | 5.717 | 0.000 | 0.215 | 0.046 | - | - | - |
| Total Between | - | - | - | - | - | - | 116.445 | 3 | 0.000* |
| Overall | 34 | 0.206 | 10.770 | 0.000 | 0.273 | 0.075 | - | - | - |
Satisfaction Mean Effect Size .503; * p<.001.
What the subgroup comparative analyses reveal is that significant effects were found for both the instrument and the mode of delivery subgroups. Although these significant effects help explain the heterogeneity demonstrated and moderate both the learning and satisfaction meta-analyses, it is important to understand that the results are best viewed as observational and not as evidence of a causal relationship (Borenstein et al., 2021).
Publication bias occurs when the studies included in an analysis differ systematically from the full set of studies that should have been included (Borenstein et al., 2021). To evaluate potential publication bias, funnel plots, trim and fill procedures, and Egger's test of the intercept were utilized in both the learning and satisfaction meta-analyses. In funnel plots, larger studies typically appear towards the top, while smaller studies sit towards the bottom. In an unbiased scenario, studies distribute symmetrically around the mean. Bias may manifest as an asymmetrical distribution, particularly when a greater number of smaller studies with larger effect sizes congregate on one side of the mean at the bottom of the plot, suggesting that only studies with significant results were published.
As observed in the learning funnel plot, there does not appear to be a statistically significant asymmetrical distribution (see Figure 4). This is confirmed by the non-significant results for Egger's test of the intercept (1-tailed p = 0.07653; 2-tailed p = 0.15307). This is not the case, however, for the satisfaction meta-analysis, where an asymmetrical distribution is observed and confirmed statistically by Egger's test of the intercept (1-tailed p = 0.01266; 2-tailed p = 0.02531) (see Figure 5).
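Egger's test of the intercept amounts to a simple regression of the standardized effect (effect divided by its standard error) on precision (the reciprocal of the standard error), with the intercept tested against zero on k - 2 degrees of freedom. The function below is an illustrative sketch of that procedure, not the study's implementation.

```python
import numpy as np
from scipy.stats import linregress, t as t_dist

def eggers_test(effects, ses):
    """Illustrative Egger's regression test: a non-zero intercept
    suggests funnel-plot asymmetry."""
    effects = np.asarray(effects, float)
    ses = np.asarray(ses, float)
    y = effects / ses          # standardized effect (signal-to-noise)
    x = 1.0 / ses              # precision
    fit = linregress(x, y)     # ordinary least squares with intercept
    t_stat = fit.intercept / fit.intercept_stderr
    df = len(y) - 2
    p_two = 2 * t_dist.sf(abs(t_stat), df)   # two-tailed p for the intercept
    return fit.intercept, t_stat, p_two
```

Halving the two-tailed p value gives the one-tailed value quoted in the text, under the directional hypothesis that small studies inflate the effect.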
To further address any potential bias, Duval and Tweedie's (2000) trim and fill method was also used, allowing for the estimation and inclusion of missing studies. The method determines where the absent studies are likely to fall within a significantly asymmetrical funnel plot, includes them in the analysis, and recalculates the combined effect. In the funnel plot for learning, using the random effects model, the point estimate and 95% confidence interval for the combined studies is 0.53056 (0.43767, 0.61225). Under trim and fill, these values are unchanged, suggesting no studies are missing. For the satisfaction meta-analysis, however, the trim and fill method suggests 10 studies are missing, with the imputed studies on the left side represented by the black circles in Figure 5. Under the random effects model, the point estimate and 95% confidence interval for the combined studies is 0.50328 (0.42721, 0.57227); with trim and fill, the imputed point estimate is 0.39425 (0.30972, 0.47260). Even with the imputed studies included, the point estimate remains within the moderate range. The trim and fill procedures therefore illustrate that, for both the learning and satisfaction meta-analyses, the impact of publication bias is negligible: even if all possibly relevant studies were included, the effect size would remain largely unchanged.
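The core of trim and fill can be illustrated with a single pass of Duval and Tweedie's (2000) L0 estimator of the number of studies missing from one side of the funnel. The published method iterates trimming and re-centering with inverse-variance weights; the sketch below is deliberately simplified (unweighted center, no iteration, no mid-ranks for ties) and is not the study's implementation.

```python
import numpy as np

def l0_missing_studies(effects, center=None):
    """Simplified, single-pass sketch of Duval and Tweedie's L0
    estimator of the number of studies missing from the left side
    of a funnel plot."""
    y = np.asarray(effects, float)
    n = len(y)
    if center is None:
        center = y.mean()          # unweighted center, for the sketch only
    dev = y - center
    # rank the absolute deviations (1-based), then sum the ranks
    # belonging to studies falling to the right of the center
    ranks = np.argsort(np.argsort(np.abs(dev))) + 1
    s_n = ranks[dev > 0].sum()
    l0 = (4 * s_n - n * (n + 1)) / (2 * n - 1)
    return max(0, int(round(l0)))  # estimated count of missing studies
```

For a symmetric funnel the rank sum balances and the estimate is zero; a surplus of large positive deviations pushes the estimate up, signalling studies to be imputed on the opposite side.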
Figure 4. Publication Bias Funnel Plot for Learning
Figure 5. Publication Bias Funnel Plot for Satisfaction
The systematic review and meta-analyses of the effects of social presence on perceived student learning and student satisfaction revealed that social presence has a moderate overall impact on both outcomes. The strength of this study lies in the samples derived for the analyses: 17,134 students across 40 studies for the learning meta-analysis and 9,230 students across 34 studies for the satisfaction meta-analysis. These robust sample sizes allowed for the exploration of heterogeneity of variances in effect sizes and for generalizing the findings regarding the importance of social presence and its continuing relationship to student learning and student satisfaction outcomes.
For both meta-analyses in this study, publication bias did not appear to impact the point estimates substantially, as evidenced by the funnel plots and, in the case of the satisfaction meta-analysis, the imputed point estimates. For the subgroup analysis, non-significant findings were reported for length of course, date of publication, student type, and subject discipline. These results contrast with those previously reported by Richardson et al. (2017), in which course length, audience, and subject discipline were found to moderate the effects. This could be because, unlike the current meta-analyses, the sample size for Richardson et al. (2017) was small, which, as the authors report, limited generalizability and investigation into the heterogeneity of variances.
In the present study, there were significant subgroup differences observed for both the mode of delivery and instrument subgroups in the learning and satisfaction meta-analyses. It was found that social presence had a lower effect on student learning and satisfaction outcomes in MOOC environments, compared to the moderate effects observed in other modes of delivery. Although still within the moderate range, synchronous environments exhibited slightly higher social presence effects, as indicated by the point estimates reported in both the learning and satisfaction subgroup analyses for mode of delivery. Within this context, the question arises: Is the operationalization of social presence consistent across learning environments? Or do the various learning environments (asynchronous, synchronous, blended, MOOC, and virtual) have inherent benefits or drawbacks for social presence?
In the learning and satisfaction instrument subgroup analyses for this study, not all instruments demonstrated moderate effects. Unlike the mode of delivery subgroup analyses for learning and satisfaction where MOOC point estimates were lower, and synchronous point estimates were higher, there was no consistency in the three instruments identified as having significant differences in their point estimates in the instrument subgroup analyses for learning and satisfaction.
A limitation of the subgroup analysis is the uneven sample sizes found for the instrument and the mode of delivery subgroups for both learning and satisfaction. Unbalanced subgroups can affect the power of the subgroup analysis (Cuijpers et al., 2021). Because of this, one should be cautious in attributing causality to the findings. Nonetheless, it does help in understanding why heterogeneity is present for both the learning and satisfaction meta-analyses. For example, the CoI survey was more frequently used in studies pertaining to student learning and satisfaction outcomes than any other instrument. This was most prevalent in the learning meta-analysis instrument subgroup. As well, there was a preponderance of researcher-designed instruments and instruments used only once. The current study found significant subgroup effects for the instrument analysis for both the learning and satisfaction meta-analyses. These results contrast with those previously reported in which the instrument scale used was found only to moderate the satisfaction meta-analysis (Richardson et al., 2017).
Although significant effects were observed for both the mode of delivery and instrument subgroups, attributing these differences solely to the instruments or mode of delivery would be misleading. Other factors, such as the learning environment/instructional design, instructor behaviours, student psychological constructs (motivation and self-efficacy), or other unreported variables, could have influenced the results. A further limitation is that insufficient information was reported in the findings of the studies included in the present meta-analyses, which did not allow for the inclusion of course design variables in a moderator analysis. Finally, there is an overreliance on subjective measures, such as student perceptions and surveys, which may also limit the assessment of social presence. This reliance (Biocca et al., 2003; Chen et al., 2015) raises concerns about objectivity and consistency across studies, not only in measuring the construct itself, but also in reporting student learning and satisfaction outcomes.
With few studies reporting academic outcomes, it was necessary to be more inclusive in defining the learning variable. As a result, a larger study and subject sample was obtained for the learning meta-analysis. This larger sample potentially impacted the non-significant findings for the learning publication bias analysis. While the subgroup analysis did identify significant effects for the mode of delivery and instrument, it is important to recognize that these are observational and not causal results.
The educational implications of this study for online learning in higher education relate to the continued need to enable the affective development of social presence in order to create a satisfying, interconnected learning environment for both instructors and students. There are varying complexities in the different instructional and learning environments afforded users in blended, asynchronous, and synchronous learning settings, and more recently in MOOCs. These complexities need to be considered in facilitating the development of the construct. As the results of this study indicate, there are challenges in how the affective domain can be uniformly implemented.
As digital learning spaces evolve, a more nuanced understanding of the interplay between human behaviour and technological affordances is key to building a purposeful foundation for social presence. To accomplish this, placing greater emphasis on research objectives that integrate course design elements with individual characteristics and behaviours can more effectively leverage the social and technological components essential for a meaningful foundation on which social presence can be established. Although course design features create the conditions necessary for social engagement, they should be aligned with learners' personal attributes (Cho et al., 2010), including communication styles (Gunawardena, 1995; Tu, 2001), motivation (Chen et al., 2020), and cultural context (Jaradat & Ajlouni, 2020; Tan, 2021a; Tu, 2001), in addition to instructors' teaching presence and responsiveness (Kyei-Blankson et al., 2019; Song & Park, 2019). When design and personal variables are thoughtfully combined, they foster a richer, more humanized learning environment that supports emotional connection, participation, and sustained engagement. This alignment also reflects principles from the CoI framework, which positions social presence as a core element alongside cognitive and teaching presence, reinforcing the idea that meaningful learning emerges through the interplay of people, pedagogy, and technology (Garrison, 2009; Garrison et al., 2010; Garrison, 2017).
The systematic review and meta-analyses reveal that social presence continues to exert a moderate effect on both student learning and satisfaction. An important observation is the inconsistency in the reporting of psychometric properties for the scale-based measures used, and the use of analytic methods that did not enable the calculation of correlation coefficients or support statistically valid analyses, which excluded studies from the meta-analyses. Although measures of social presence vary according to the definitions that underpin the construct, and significant effects were found for a few instruments in the subgroup analyses, it is important to understand that these effects are observational and not causal, as other variables not reported in the studies extracted for the meta-analyses could have influenced the results.
In conclusion, greater attention to reporting moderator variables, that include course design, instructor, and student behaviours, is recommended to provide a more nuanced understanding of the effects of social presence on student learning and satisfaction outcomes. Further, continued research on the effects that MOOCs have on social presence is needed to better understand how best to create social presence for this mode of delivery. Future research should focus on advancing our understanding of the nomological network that underpins social presence, bridging theoretical and empirical frameworks across diverse learning environments. This approach will enhance rigour in research design and facilitate empirically validated investigations into the effectiveness of online learning in higher-education settings.
Funding for this research has been provided by the Social Sciences and Humanities Research Council of Canada's Insight Grant program.
References marked with an asterisk indicate studies included in the meta-analysis.
*Alsadoon, E. (2018). The impact of social presence on learners' satisfaction in mobile learning. TOJET the Turkish Online Journal of Educational Technology, 17(1). http://cyber.usask.ca/login?url=https://www.proquest.com/scholarly-journals/impact-social-presence-on-learners-satisfaction/docview/2025352417/se-2?accountid=14739
Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. The Internet and Higher Education, 11(3), 133-136. https://doi.org/10.1016/j.iheduc.2008.06.003
*Arbaugh, J. B. (2013). Does academic discipline moderate CoI course outcomes relationships in online MBA courses? The Internet and Higher Education, 17, 16-28. http://dx.doi.org/10.1016/j.iheduc.2012.10.002
*Arbaugh, J. B. (2014). System, scholar or students? Which most influences online MBA course effectiveness? Journal of Computer Assisted Learning, 30(4), 349-362. https://doi.org/10.1111/jcal.12048
Argyle, M., & Dean, J. (1965). Eye contact, distance, and affiliation. Sociometry, 28, 289-304. https://doi.org/10.2307/2786027
Bangert, A. (2008). The influence of social presence and teaching presence on the quality of online critical inquiry. Journal of Computing in Higher Education, 20(1), 34-61. https://doi.org/10.1007/BF03033431
*Barbera, E., Clara, M., & Linder-Vanberschot, J. A. (2013). Factors influencing student satisfaction and perceived learning in online courses. E-Learning and Digital Media, 10(3), 226-235. https://doi.org/10.2304/elea.2013.10.3.226
Biocca, F., Harms, C., & Burgoon, J. K. (2003). Toward a more robust theory and measure of social presence: Review and suggested criteria. Presence: Teleoperators and Virtual Environments, 12(5), 456-480. https://doi.org/10.1162/105474603322761270
Biocca, F., Harms, C., & Gregg, J. (2001). The networked minds measure of social presence: Pilot test of the factor structure and concurrent validity. Proceedings of the Fourth Annual International Presence Workshop (Presence 2001). Philadelphia, PA. https://ispr.info/presence-conferences/previous-conferences/presence-2001/
Borenstein, M. (2019). Common mistakes in meta-analysis and how to avoid them. Biostat.
Borenstein, M., Hedges, L. V., Higgins, J. P., & Rothstein, H. R. (2009). Introduction to meta-analysis. John Wiley & Sons.
Borenstein, M., Hedges, L. V., Higgins, J. P., & Rothstein, H. R. (2010). A basic introduction to fixed-effect and random-effects models for meta-analysis. Research Synthesis Methods, 1(2), 97-111. https://doi.org/10.1002/jrsm.12
Borenstein, M., Hedges, L. V., Higgins, J. P., & Rothstein, H. R. (2021). Introduction to meta-analysis (2nd ed.). John Wiley & Sons.
Borup, J., West, R. E., & Graham, C. R. (2012). Improving online social presence through asynchronous video. The Internet and Higher Education, 15(3), 195-203. https://doi.org/10.1016/j.iheduc.2011.11.001
Bowers, J., & Kumar, P. (2015). Students' perceptions of teaching and social presence: A comparative analysis of face-to-face and online learning environments. International Journal of Web-Based Learning and Teaching Technologies, 10(1), 27-44. http://dx.doi.org/10.4018/ijwltt.2015010103
*Cakiroglu, U. (2019). Community of inquiry in web conferencing: Relationships between cognitive presence and academic achievements. Open Praxis, 11(3), 243-260. https://doi.org/10.5944/openpraxis.11.3.968
Caskurlu, S., Maeda, Y., Richardson, J. C., & Lv, J. (2020). A meta-analysis addressing the relationship between teaching presence and students' satisfaction and learning. Computers & Education, 157, 103966. https://doi.org/10.1016/j.compedu.2020.103966
Chapman, K. (2021). Characteristics of systematic reviews in the social sciences. The Journal of Academic Librarianship, 47(5). https://doi.org/10.1016/j.acalib.2021.102396
Chen, X., Fang, Y., & Lockee, B. (2015). Integrative review of social presence in distance education: Issues and challenges. Educational Research and Reviews, 10(13), 1796-1806. https://doi.org/10.5897/ERR2015.2276
*Chen, Y., Gao, Q., Yuan, Q., & Tang, Y. (2020). Discovering MOOC learner motivation and its moderating role. Behaviour & Information Technology, 39(12), 1257-1275. https://doi.org/10.1080/0144929X.2019.1661520
*Chen, C., Jones, K. T., & Xu, S. (2018). The association between students' style of learning preferences, social presence, collaborative learning and learning outcomes. The Journal of Educators Online, 15(1). https://doi.org/10.9743/JEO2018.15.1.3
Cheung, M. W. L., Ho, R. C., Lim, Y., & Mak, A. (2012). Conducting a meta-analysis: Basics and good practices. International Journal of Rheumatic Diseases, 15(2), 129-135. https://doi.org/10.1111/j.1756-185X.2012.01712.x
Cheung, M. W. L., & Vijayakumar, R. (2016). A guide to conducting a meta-analysis. Neuropsychology Review, 26(2), 121-128. https://doi.org/10.1007/s11065-016-9319-z
*Cho, M. H., Demei, S., & Laffey, J. (2010). Relationships between self-regulation and social experiences in asynchronous online learning environments. Journal of Interactive Learning Research, 21(3), 297-316. https://www.learntechlib.org/primary/p/29491/
*Cobb, S. C. (2011). Social presence, satisfaction, and perceived learning of RN-to-BSN students in web-based nursing courses. Nursing Education Perspectives, 32(2), 115-119. https://doi.org/10.5480/1536-5026-32.2.115
Cochran, W. G. (1954). The combination of estimates from different experiments. Biometrics, 10(1), 101-129. https://doi.org/10.2307/3001666
*Costley, J. (2019). The relationship between social presence and cognitive load. Interactive Technology and Smart Education, 16(2), 172-182. https://doi.org/10.1108/ITSE-12-2018-0107
Cuijpers, P., Griffin, J. W., & Furukawa, T. A. (2021). The lack of statistical power of subgroup analyses in meta-analyses: a cautionary note. Epidemiology and Psychiatric Sciences, 30, e78. https://doi.org/10.1017/S2045796021000664
*Daigle, D. T., & Stuvland, A. (2021). Social presence as best practice: The online classroom needs to feel real. PS, Political Science & Politics, 54(1), 182-183. https://doi.org/10.1017/S1049096520001614
*Dang, M. Y., Zhang, G. Y., & Amer, B. (2019). Social networks among students, peer TAs, and instructors and their impacts on student learning in the blended environment: A model development and testing. Communications of the Association for Information Systems, 44(1), 764-782. https://doi.org/10.17705/1CAIS.04436
*Daspit, J. J., & D'Souza, D. E. (2012). Using the community of inquiry framework to introduce Wiki environments in blended-learning pedagogies: Evidence from a business capstone course. Academy of Management Learning & Education, 11(4), 666-683. https://doi.org/10.5465/amle.2010.0154
Dewey, J. (1963). Experience and education. Collier.
Duval, S., & Tweedie, R. (2000). Trim and fill: A simple funnel-plot-based method of testing and adjusting for publication bias in meta-analysis. Biometrics, 56(2), 455-463. https://doi.org/10.1111/j.0006-341x.2000.00455.x
Fisher, R. A. (1915). Frequency distribution of the values of the correlation coefficient in samples from an indefinitely large population. Biometrika, 10(4), 507-521. https://doi.org/10.2307/2331838
Garrison, D. R. (2009). Communities of inquiry in online learning. In P. Rogers, G. Berg, J. Boettcher, C. Howard, L. Justice, & K. Schen (Eds.), Encyclopedia of distance learning (2nd ed), pp. 352-355. IGI Global. https://doi.org/10.4018/978-1-60566-198-8.ch052
Garrison, D. R. (2017). E-Learning in the 21st Century: A community of inquiry framework for research and practice (3rd ed.). Routledge. https://doi.org/10.4324/9781315667263
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2, 87-105. http://dx.doi.org/10.1016/S1096-7516(00)00016-6
*Garrison, D. R., Cleveland-Innes, M., & Fung, T. S. (2010). Exploring causal relationships among teaching, cognitive and social presence: Student perceptions of the community of inquiry framework. The Internet and Higher Education, 13(1), 31-36. https://doi.org/10.1016/j.iheduc.2009.10.002
Goffman, E. (1959). The presentation of self in everyday life. Anchor.
Gunawardena, C. N. (1995). Social presence theory and implications for interaction and collaborative learning in computer conferences. International Journal of Educational Telecommunications, 1(2/3), 147-166. https://www.learntechlib.org/primary/p/15156/
Gunawardena, C. N., & Zittle, F. J. (1997). Social presence as a predictor of satisfaction within a computer-mediated conferencing environment. The American Journal of Distance Education,11(3), 8-26. https://doi.org/10.1080/08923649709526970
Hedges, L., & Olkin, I. (1985). Statistical models for meta-analysis. Academic Press.
Higgins, J. P., & Thompson, S. G. (2002). Quantifying heterogeneity in a meta-analysis. Statistics in Medicine, 21(11), 1539-1558. https://doi.org/10.1002/sim.1186
Hiltz, S. R. (1994). The virtual classroom: Learning without limits via computer networks. Intellect Books.
*Horzum, M. B. (2015). Interaction, structure, social presence, and satisfaction in online learning. Eurasia Journal of Mathematics, Science and Technology Education, 11(3), 505-512. https://doi.org/10.12973/eurasia.2014.1324a
*Hostetter, C., & Busch, M. (2006). Measuring up online: The relationship between social presence and student learning satisfaction. The Journal of Scholarship of Teaching and Learning, 6(2), 1-12. https://scholarworks.iu.edu/journals/index.php/josotl/article/view/1670
Hostetter, C., & Busch, M. (2013). Community matters: Social presence and learning outcomes. Journal of the Scholarship of Teaching and Learning, 13(1), 77-86. https://files.eric.ed.gov/fulltext/EJ1011685.pdf
Hurst, B., Wallace, R., & Nixon, S. B. (2013). The impact of social interaction on student learning. Reading Horizons, 52(4), 375-398. https://scholarworks.wmich.edu/reading_horizons/vol52/iss4/5/
*Jaradat, S., & Ajlouni, A. O. (2020). Social presence and self-efficacy in relation to student satisfaction in online learning setting: A predictive study. International Journal of Education and Practice, 8(4), 759-773. https://doi.org/10.18488/journal.61.2020.84.759.773
*Johnson, R. D., Hornik, S., & Salas, E. (2008). An empirical examination of factors contributing to the creation of successful e-learning environments. International Journal of Human-Computer Studies, 66(5), 356-369. https://doi.org/10.1016/j.ijhcs.2007.11.003
*Johnson, R. D., Gueutal, H., & Falbe, C. M. (2009). Technology, trainees, metacognitive activity and e-learning effectiveness. Journal of Managerial Psychology, 24(6), 545-566. https://doi.org/10.1108/02683940910974125
Joksimovic, S., Gasevic, D., Kovanovic, V., Riecke, B. E., & Hatala, M. (2015). Social presence in online discussions as a process predictor of academic performance. Journal of Computer Assisted Learning, 31(6), 638-654. https://doi.org/10.1111/jcal.12107
*Joo, Y. J., Lim, K. Y., & Kim, E. K. (2011). Online university students' satisfaction and persistence: Examining perceived level of presence, usefulness and ease of use as predictors in a structural model. Computers & Education, 57(2), 1654-1664. https://doi.org/10.1016/j.compedu.2011.02.008
Kang, M., Choi, H., & Park, S. (2007). Construction and validation of a social presence scale for measuring online learners' involvement. In C. Montgomerie & J. Seale (Eds.), Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2007 (pp. 1829-1833). AACE. https://www.learntechlib.org/primary/p/25619/
*Kang, M., Liew, B. T., Kim, J., & Park, Y. (2014). Learning presence as a predictor of achievement and satisfaction in online learning environments. International Journal on E-Learning, 13(2), 193-208. https://www.learntechlib.org/primary/p/39342/
Ke, F. (2010). Examining online teaching, cognitive, and social presence for adult students. Computers & Education, 55(2), 808-820. https://doi.org/10.1016/j.compedu.2010.03.013
Kehrwald, B. (2010). Being online: Social presence as subjectivity in online learning. London Review of Education, 8(1), 39-50. https://doi.org/10.1080/14748460903557688
*Kim, J. (2011). Developing an instrument to measure social presence in distance higher education. British Journal of Educational Technology, 42(5), 763-777. https://doi.org/10.1111/j.1467-8535.2010.01107.x
*Kim, J., Kwon, Y., & Cho, D. (2011). Investigating factors that influence social presence and learning outcomes in distance higher education. Computers & Education, 57(2), 1512-1520. https://doi.org/10.1016/j.compedu.2011.02.005
*Kovanovic, V., Joksimovic, S., Poquet, O., Hennis, T., Čukic, I., de Vries, P., Hatala, M., Dawson, S., Siemens, G., & Gasevic, D. (2018). Exploring communities of inquiry in Massive Open Online Courses. Computers & Education, 119, 44-58. https://doi.org/10.1016/j.compedu.2017.11.010
*Kozuh, I., Jeremic, Z., Sarjas, A., Julija, L. B., Devedzic, V., & Debevc, M. (2015). Social presence and interaction in learning environments: The effect on student success. Journal of Educational Technology & Society, 18(1), 223-236.
Kreijns, K., Kirschner, P. A., Jochems, W., & van Buuren, H. (2011). Measuring perceived social presence in distributed learning groups. Education and Information Technologies, 16(4), 365-381. https://doi.org/10.1007/s10639-010-9135-7
Kreijns, K., Van Acker, F., Vermeulen, M., & Van Buuren, H. (2014). Community of inquiry: Social presence revisited. E-Learning and Digital Media, 11(1), 5-18. https://doi.org/10.2304/elea.2014.11.1.5
Kreijns, K., Xu, K., & Weidlich, J. (2022). Social presence: Conceptualization and measurement. Educational Psychology Review, 34(1), 139-170. https://doi.org/10.1007/s10648-021-09623-8
*Ku, F., Ho, E., & Lam, P. (2012). Facebook for teaching and learning and its effect on social presence and sense of community. International Conference on e-Learning. Kidmore End, Academic Conferences International Limited. https://www.proceedings.com/content/017/017678webtoc.pdf
*Kucuk, S., & Richardson, J. C. (2019). A structural equation model of predictors of online learners' engagement and satisfaction. Online Learning, 23(2), 196-216. https://doi.org/10.24059/olj.v23i2.1455
*Kyei-Blankson, L., Ntuli, E., & Donnelly, H. (2019). Establishing the importance of interaction and presence to student learning in online environments. Journal of Interactive Learning Research, 30(4), 539-560. https://www.learntechlib.org/primary/p/161956/
*Law, K. M., Geng, S., & Li, T. (2019). Student enrollment, motivation and learning performance in a blended learning environment: The mediating effects of social, teaching, and cognitive presence. Computers & Education, 136, 1-12. https://doi.org/10.1016/j.compedu.2019.02.021
Leong, P. (2011). Role of social presence and cognitive absorption in online learning environments. Distance Education, 32(1), 5-28. https://doi.org/10.1080/01587919.2011.565495
Liaw, S. S., & Huang, H. M. (2000). Enhancing interactivity in web-based instruction: A review of the literature. Educational Technology, 40(3), 41-45.
Lombard, M., & Ditton, T. (1997). At the heart of it all: The concept of presence. Journal of Computer-Mediated Communication, 3(2), 1-11. https://doi.org/10.1111/j.1083-6101.1997.tb00072.x
Lowenthal, P. R. (2010). The evolution and influence of social presence theory on online learning. In T. Kidd (Ed.), Online education and adult learning: New frontiers for teaching practices (pp. 124-139). IGI Global. https://doi.org/10.4018/9781605669847.ch010
Moallem, M. (2015). The impact of synchronous and asynchronous communication tools on learner self-regulation, social presence, immediacy, intimacy and satisfaction in collaborative online learning. The Online Journal of Distance Education and E-Learning, 3(3), 55-77.
Moher, D., Liberati, A., Tetzlaff, J., & Altman, D. G. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Medicine, 6(7), 1-6. https://doi.org/10.1371/journal.pmed.1000097
Mykota, D. (2018). The effective affect: A scoping review of social presence. International Journal of E-Learning & Distance Education, 33(2), 1-30. https://www.ijede.ca/index.php/jde/article/view/1080
*Nasir, M. K. M. (2020). The influence of social presence on students' satisfaction toward online course. Open Praxis, 12(4), 485-493. https://doi.org/10.5944/openpraxis.12.4.1141
*Natarajan, J., & Joseph, M. A. (2022). Impact of emergency remote teaching on nursing students' engagement, social presence, and satisfaction during the COVID-19 pandemic. Nursing Forum, 57(1), 42-48. https://doi.org/10.1111/nuf.12649
*Newberry, B. (2003). Effects of social motivation for learning and student social presence on engagement and satisfaction in online classes [Unpublished doctoral dissertation, University of Kansas].
*Nyachae, J. N. (2011). The effect of social presence on students' perceived learning and satisfaction in online courses [Unpublished doctoral dissertation, Kansas University].
*Pillai, R., & Sivathanu, B. (2020). An empirical study on the online learning experience of MOOCs: Indian students' perspective. International Journal of Educational Management, 34(3), 586-609. https://doi.org/10.1108/IJEM-01-2019-0025
Poth, R. D. (2018). Social presence in online learning. In M. Marmon (Ed.), Enhancing social presence in online environments (pp. 88-116). IGI Global. https://doi.org/10.4018/978-1-5225-3229-3.ch005
PRISMA. (2025). Welcome to the PRISMA website. https://www.prisma-statement.org/
*Reio, T. G., & Crim, S. J. (2013). Social presence and student satisfaction as predictors of online enrollment intent. The American Journal of Distance Education, 27(2), 122-133. https://doi.org/10.1080/08923647.2013.775801
Rettie, R. (2003). Connectedness, awareness, and social presence [Paper]. 6th International Presence Workshop, Aalborg University, Aalborg, Denmark. https://eprints.kingston.ac.uk/id/eprint/2106/1/Rettie.pdf
Richardson, J. C., Maeda, Y., Lv, J., & Caskurlu, S. (2017). Social presence in relation to students' satisfaction and learning in the online environment: A meta-analysis. Computers in Human Behavior, 71, 402-417. https://doi.org/10.1016/j.chb.2017.02.001
Richardson, J. C., & Swan, K. (2003). An examination of social presence in online courses in relation to students' perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7(1), 68-88. https://doi.org/10.24059/olj.v7i1.1864
Robb, C. A., & Sutton, J. (2014). The importance of social presence and motivation in distance learning. The Journal of Technology, Management, and Applied Engineering, 30(2), 2-10.
Rothstein, H. R., Sutton, A. J., & Borenstein, M. (Eds.). (2006). Publication bias in meta-analysis: Prevention, assessment, and adjustments. John Wiley & Sons. https://doi.org/10.1002/0470870168
Rourke, L., Anderson, T., Garrison, D. R., & Archer, W. (1999). Assessing social presence in asynchronous text-based computer conferencing. Journal of Distance Education, 14(2), 50-71.
*Saadatmand, M., Uhlin, L., Hedberg, M., Abjornsson, L., & Kvarnstrom, M. (2017). Examining learners' interaction in an open online course through the community of inquiry framework. European Journal of Open, Distance and E-Learning, 20(1), 61-79. https://doi.org/10.1515/eurodl-2017-0004
*Shea, P., & Bidjerano, T. (2009). Cognitive presence and online learner engagement: A cluster analysis of the community of inquiry framework. Journal of Computing in Higher Education, 21(3), 199-217. https://doi.org/10.1007/s12528-009-9024-5
Short, J., Williams, E., & Christie, B. (1976). The social psychology of telecommunications. John Wiley & Sons.
*So, H.-J., & Brush, T. A. (2008). Student perceptions of collaborative learning, social presence and satisfaction in a blended learning environment: Relationships and critical factors. Computers & Education, 51(1), 318-336. https://doi.org/10.1016/j.compedu.2007.05.009
*Song, H., Kim, J., & Park, N. (2019). I know my professor: Teacher self-disclosure in online education and a mediating role of social presence. International Journal of Human-Computer Interaction, 35(6), 448-455. https://doi.org/10.1080/10447318.2018.1455126
*Sorden, S. D., & Munene, I. I. (2013). Constructs related to community college student satisfaction in blended learning. Journal of Information Technology Education, 12, 251-270. https://doi.org/10.28945/1890
*Spears, L. R. (2012). Social presence, social interaction, collaborative learning, and satisfaction in online and face-to-face courses [Dissertation, Iowa State University].
*Swan, K., & Shih, L. F. (2019). On the nature and development of social presence in online course discussions. Online Learning, 9(3). https://doi.org/10.24059/olj.v9i3.1788
*Tan, C. (2021a). The impact of COVID-19 on student motivation, community of inquiry and learning performance. Asian Education and Development Studies, 10(2), 308-321. https://doi.org/10.1108/AEDS-05-2020-0084
*Tan, C. (2021b). The impact of COVID-19 pandemic on student learning performance from the perspectives of community of inquiry. Corporate Governance, 21(6), 1215-1228. https://doi.org/10.1108/CG-09-2020-0419
*Teng, Y. (2005). An examination of social presence in online learning through the eyes of native and non-native English speakers [Dissertation, State University of New York at Albany].
The Campbell Collaboration. (2020). Campbell systematic reviews: Policies and guidelines. Campbell Policies and Guidelines, Series No. 1.
Tu, C. H. (2001). How Chinese perceive social presence: An examination of interaction in online learning environment. Educational Media International, 38(1), 45-60. https://doi.org/10.1080/09523980010021235
Tu, C. H. (2002). The relationship between social presence and online privacy. The Internet and Higher Education, 5(4), 293-318. https://doi.org/10.1016/S1096-7516(02)00134-3
Tu, C. H., & McIsaac, M. S. (2002). The relationship of social presence and interaction in online classes. American Journal of Distance Education, 16(3), 131-150. https://doi.org/10.1207/S15389286AJDE1603_2
*Turel, Y. K. (2016). Relationships between students' perceived team learning experiences, team performances, and social abilities in a blended course setting. The Internet and Higher Education, 31, 79-86. https://doi.org/10.1016/j.iheduc.2016.07.001
Valentine, J. C., Pigott, T. D., & Rothstein, H. R. (2010). How many studies do you need? A primer on statistical power for meta-analysis. Journal of Educational and Behavioral Statistics, 35(2), 215-247. https://doi.org/10.3102/1076998609346961
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Harvard University Press.
*Weidlich, J., & Bastiaens, T. J. (2017). Explaining social presence and the quality of online learning with the SIPS model. Computers in Human Behavior, 72, 479-487. https://doi.org/10.1016/j.chb.2017.03.016
Weidlich, J., Goksün, D. O., & Kreijns, K. (2023). Extending social presence theory: Social presence divergence and interaction integration in online distance learning. Journal of Computing in Higher Education, 35(3), 391-412. https://doi.org/10.1007/s12528-022-09325-2
Whiteside, A. L. (2015). Introducing the social presence model to explore online and blended learning experiences. Online Learning, 19(2), 53-. https://doi.org/10.24059/olj.v19i2.453
Whiteside, A. L. (2017). Understanding social presence as critical literacy. In A. L. Whiteside, A. G. Dikkers, & K. Swan (Eds.), Social presence in online learning: Multiple perspectives on practice and research (pp. 133-142). Stylus Publishing.
Wiener, M., & Mehrabian, A. (1968). Language within language: Immediacy, a channel in verbal communication. Appleton-Century-Crofts.
Wise, A., Chang, J., Duffy, T., & Del Valle, R. (2004). The effects of teacher social presence on student satisfaction, engagement, and learning. Journal of Educational Computing Research, 31(3), 247-271. https://doi.org/10.2190/V0LB-1M37-RNR8-Y2U1
*Yildirim, D., & Seferoglu, S. S. (2021). Evaluation of the effectiveness of online courses based on the community of inquiry model. The Turkish Online Journal of Distance Education, 22(2), 147-163. https://doi.org/10.17718/tojde.906834
*Yin, B., & Yuan, C. H. (2021). Precision teaching and learning performance in a blended learning environment. Frontiers in Psychology, 12, 631125. https://doi.org/10.3389/fpsyg.2021.631125
Zawacki-Richter, O., Kerres, M., Bedenlier, S., Bond, M., & Buntins, K. (2020). Systematic reviews in educational research: Methodology, perspectives and application. Springer Nature. http://library.oapen.org/handle/20.500.12657/23142
*Zhan, Z., & Mei, H. (2013). Academic self-concept and social presence in face-to-face and online learning: Perceptions and effects on students' learning achievement and satisfaction across environments. Computers & Education, 69, 131-138. https://doi.org/10.1016/j.compedu.2013.07.002
David Mykota is a professor at the University of Saskatchewan. Some of his current research interests include online learning program development, social presence, knowledge synthesis, scoping studies, and systematic reviews. He currently instructs undergraduate, certificate, and graduate e-learning courses in special education and educational psychology and is presently engaged in SSHRC-funded research pertaining to the affective domain of e-learning. Email: david.mykota@usask.ca
Figure 1 image description: A flow diagram of the three-stage systematic review screening process.
Figure 2 image description: A forest plot of the learning meta-analysis with study name, statistics for each study, and correlation and 95% CI. The statistics include correlation, lower limit, upper limit, Z-value, and p-value.
Figure 3 image description: A forest plot of the satisfaction meta-analysis with study name, statistics for each study, and correlation and 95% CI. The statistics include correlation, lower limit, upper limit, Z-value, and p-value.
Figure 4 image description: A funnel plot for learning, used to assess publication bias. Standard error is plotted against Fisher's Z.
Figure 5 image description: A funnel plot for student satisfaction, used to assess publication bias. Standard error is plotted against Fisher's Z.