International Review of Research in Open and Distributed Learning

Volume 21, Number 1

January - 2020

 

Doctoral Students’ Learning Success in Online-Based Leadership Programs: Intersection with Technological and Relational Factors

 

HyunKyung Lee1, Heewon Chang2, and Lynette Bryan2
1Hankuk University of Foreign Studies, Seoul, Korea, 2Eastern University, PA, USA

 

Abstract

This study examines how technological and relational factors independently and interactively predict the perceived learning success of doctoral students enrolled in online-based leadership programs offered in the United States. The 73-item Online Learning Success Scale (OLSS) was constructed, based on existing instruments, and administered online to collect self-reported data on three primary variables: student learning success (SLS), relational factors (RF), and technological factors (TF). The SLS variable focuses on the gain of knowledge and skills, persistence, and self-efficacy; the RF on the student-student relationship, the student-faculty relationship, and the student-non-teaching staff relationship; and the TF on the ease of use, flexibility, and usefulness. In total, 210 student responses from 26 online-based leadership doctoral programs in the United States were used in the final analysis. The results demonstrate that RF and TF separately and together predict SLS. A multiple regression analysis indicates that, while all dimensions of TF and RF are significant predictors of SLS, the strongest predictor of SLS is the student-faculty relationship. This study suggests that building relationships with faculty and peers is critical to leadership doctoral students’ learning success, even in online-based programs that offer effective technological support.

Keywords: online education, online learning success, leadership doctoral program, technological factors, relational factors

Introduction

Student learning success (SLS) is everyone’s business in higher education. Learning success among doctoral students in growing online programs is a particular concern for three reasons. First, doctoral program completion, an indicator of learning success, occurs at a lower rate than completion of other educational endeavors. The PhD Completion Project evaluated doctoral completion rates and attrition patterns across major universities in the United States and Canada and found that only 56.6% of students completed their programs, with the lowest completion rates occurring in the social sciences and humanities (Sowell, Zhang, Redd, & King, 2008). Considering that each individual and institution embarking on the PhD journey invests significant time, money, and intellectual resources, unsuccessful doctoral learning means a substantial waste of resources for the students themselves, their families, the faculty and staff of their institutions, and the intellectual community as a whole.

Second, online degree-granting programs, particularly at the graduate level, are growing significantly in the United States. According to the National Center for Education Statistics (2018), 31.7% of students enrolled in degree-granting postsecondary institutions in 2016 were engaged in distance or online education, either partially or fully; among graduate students, the percentage was 36.8%. In 2017, 239 online leadership doctoral programs were offered in the United States, according to our website search of all 50 state departments of education. Online programs provide convenience to graduate students who, while maintaining their work responsibilities, learn anywhere at any time through technology-facilitated tools such as discussion boards, web conferencing, blogging, and social networks (Alammary, Sheard, & Carbone, 2014; Hill, 2012). Online-based education is regarded as the future of higher education, and an increasing number of institutions include online programs in their long-term strategic planning (Allen, Seaman, Poulin, & Straut, 2016; Bayne, Gallagher, & Lamb, 2014). Although online-based learning poses challenges to student learning success that differ from those of face-to-face learning (Kennedy, Terrell, & Lohle, 2015; Lambie, Hayes, Griffith, Limberg, & Mullen, 2014; Rockinson-Szapkiw, Wendt, Whighting, & Nisbet, 2016), the impact of technology on doctoral SLS has not been fully explored.

Third, although the modality of instruction changes, student learning needs based on relationships do not disappear even in online environments. For example, social support from family, friends, and peers has a positive impact on academic self-regulation (Akyol & Garrison, 2011; Williams, Wall, & Fish, 2019) and student learning even in technology-facilitated environments (Gardner, 2009; Garrison, 2007; Lee, 2014). Students still seek timely feedback, encouragement, and openness as they explore new concepts through productive online dialogue with peers and instructors (Bolliger & Halupa, 2012; Kumar, 2014). In addition, interactions with staff are indicators of service quality and have a direct impact on student loyalty and satisfaction (Martínez-Argüelles & Batalla-Busquets, 2016; Ravindran & Kalpana, 2012).

Considering these problems, this study explores how technological factors (TF) and relational factors (RF) predict doctoral SLS in U.S. online-based leadership programs. This purpose is pursued through the following research questions:

  1. How do technological factors and relational factors separately and interactively predict doctoral student learning success in online-based leadership programs?
  2. Which subfactors of the technological and relational factors are the best predictors of doctoral student learning success in online-based leadership programs?

Theoretical Framework

Three constructs make up the theoretical framework of this study: technological factors (TF), relational factors (RF), and student learning success (SLS). The relationship among these constructs is represented in Figure 1.


Figure 1. Relationship among the three constructs of this study.

Technological Factors

Colleges and universities use technology to varying degrees to create online learning environments. Some courses are delivered fully online, relying heavily on embedded technological features, while others use technology to complement face-to-face instruction. Despite these variations, the common thread is a focus on technology as an integral means of providing instruction. A review of the literature highlights three aspects of technology-facilitated instruction: flexibility, usefulness, and ease of use (Arbaugh, 2000; Bures, Abrami, & Amundsen, 2000; Hart, 2012).

Flexibility, the first technological subfactor, allows students to pursue degrees across geographical, cultural, professional, and generational borders (Bolliger & Halupa, 2012; Sampson, Leonard, Ballenger, & Coleman, 2010). Although doctoral students in online-based programs require discipline and independence to be academically successful, these potential challenges are outweighed by the convenience of using technology to access quality conversations with professors and peers from a distance while balancing work obligations and family responsibilities within a flexible academic schedule (Erichsen, Bolliger, & Halupa, 2014; Garrison, 2007). Arbaugh (2000) argued that online learning, by transcending time and location restrictions, would enable participants to reach levels of relational intimacy comparable to those of face-to-face groups, albeit over a longer time period.

Usefulness, the second technological subfactor, refers to the degree to which technology can enrich and enhance the learning experience (Davis, 1989). Both usefulness and accessibility contribute to the effectiveness of technology (Edmunds, Thorpe, & Conole, 2012; Joo, Lim, & Kim, 2011). A study of 421 university students in the United Kingdom found that the perceived usefulness of technology predicted its actual use for work, school, and social purposes (Edmunds et al., 2012). Arbaugh’s (2000) student satisfaction study found that graduate management education students who believed technology was valuable and perceived it to be easy to use were more likely to engage with technology in their degree work.

Ease of use, the third technological subfactor, refers to the degree to which technology can be used without undue effort or distraction (Davis, 1989). Ease of use has been identified as a critical element affecting student acceptance of technology. A study of 136 students in a full-time online-based college program found that student attitude was the most important determinant of the acceptance of technology as a learning tool (Cheung & Vogel, 2013). A positive mindset about technology as a flexible, valuable, and easy-to-use resource motivates students toward intentional use of technology as a means of developing relationships (Davis, 1989; Edmunds et al., 2012; Joo et al., 2011).

Relational Factors

Educational theorists have historically pointed to the integration of academics with social involvement and engagement as critical to student retention through graduation (Tinto, 1999). The community of inquiry framework emphasizes the importance of social presence even when technology is used for learning; it holds that social, cognitive, and teaching presences interactively create deep meaning in an academic environment mediated by technology (Akyol & Garrison, 2011; Arbaugh et al., 2008; Garrison, 2007; Lai, 2015; Shea & Bidjerano, 2009). The online delivery of instruction does not negate the need to build a sense of school community to increase student satisfaction and retention; it simply changes the methods used to interact (Roach & Lemasters, 2006). Three types of relationships were examined as RF in this study: student-student, student-faculty, and student-non-teaching staff.

Student-student interaction, the first relational subfactor, is considered critical to students’ individual cognitive development in an online higher education environment (Shea & Bidjerano, 2009). A study of graduates’ reflections on an online-based doctorate in educational technology determined that well-selected readings, open-ended questions, and guided conversations were influential in promoting interaction between students and critical thinking about the subject matter (Fuller, Risner, Lowder, Hart, & Bachenheimer, 2014). A quantitative content analysis of discussion board messages from two groups of college students found that the online discussion board was an effective means of developing community, which enabled individual members to reason through the topics and construct their own thinking (Lee, 2014).

Student-faculty interaction, the second relational subfactor, has been identified as the most critical aspect of student satisfaction. The qualities students sought included timely feedback and responsiveness to questions, attentiveness, encouragement, and sincerity (Bolliger & Halupa, 2012). A study of second-year doctoral students found that 90% of the students credited their instructors with facilitating productive dialogue and providing timely feedback that encouraged the exploration of new concepts (Kumar, 2014). Student-faculty interaction also influences a program’s future enrollment, because students’ satisfaction translates into a willingness to recommend the program to others (Martínez-Argüelles & Batalla-Busquets, 2016).

The last relational subfactor, the student-non-teaching staff relationship, was also found to be important to overall satisfaction within online-based higher education programs. Contact personnel in departments such as registration and records are an influential consideration in students’ evaluation of the service quality of the university, and this satisfaction with service quality leads to the retention and success of students (Ravindran & Kalpana, 2012; Sohail & Shaikh, 2004).

Student Learning Success

The success of the doctoral student typically culminates in the completion of the dissertation and the attainment of the doctoral degree. However, a deeper exploration of student success addresses academic achievement; engagement in educationally purposeful activities; satisfaction; acquisition of desired knowledge, skills, and competencies; persistence; attainment of educational objectives; and post-college performance (Im & Kang, 2019; Kuh, Kinzie, Schuh, & Whitt, 2010). While educational “success” has been broadly and often studied, York, Gibson, and Rankin (2015) acknowledged a lack of comprehensive instrumentation for measuring success outside of academic achievements such as grades, GPA, and degree attainment. This study created a tool to measure SLS, or perceived success, in doctoral endeavors through three specific indicators: the gain of knowledge and skills, self-efficacy, and persistence. All of these have been shown to lead to degree completion, the ultimate measure of student success (Gardner, 2009; Ivankova & Stick, 2007; Lambie et al., 2014).

Beyond the earned degree, success in doctoral programs is defined as the gain of knowledge and skills that allow the student to think critically and creatively (Gardner, 2009). A survey of 131 graduate students found that students who actively engaged in the online learning community, both socially and cognitively, had a greater sense of perceived scholarship that contributed to their course success (Rockinson-Szapkiw et al., 2016). For doctoral students, active engagement in learning, resulting in the perceived gain of knowledge and skills, is considered critical to developing self-efficacy.

Successful completers of doctoral programs are likely to be students who believe in their own ability to conduct empirical research and write up research findings. An exploratory investigation of PhD education students found that students’ self-efficacy increased with the completion of classes and involvement in research opportunities (Lambie et al., 2014). Bandura (1997) links self-efficacy to a person’s choices, goals, expended effort, and willingness to persist in the face of adversity. Self-efficacy can either obstruct students’ progress through self-destructive stress or lift them above academic demands to reach accomplishments beyond what they thought possible (Bures et al., 2000; Lee & Mao, 2016).

Persistence, leading to degree completion, is considered a measure of institutional and programmatic success. Approximately 50% of doctoral students in the social sciences, humanities, and education fail to earn their PhDs, and this rate rises by a further 10% to 15% for students enrolled in technology-based programs (Kennedy et al., 2015). In an online-based learning environment, mentoring and faculty support allow doctoral students to persist in independently conducting, analyzing, and presenting research to complete the doctoral program (Ampaw & Jaeger, 2012; Erichsen et al., 2014).

In summary, the theoretical framework of this study connects three constructs: TF, RF, and SLS. The first construct, TF, which serves as an independent variable, consists of three subfactors: flexibility, usefulness, and ease of use. The second construct, RF, also serves as an independent variable and focuses on the student-student, student-faculty, and student-non-teaching staff relationships. Finally, the construct of SLS, the dependent variable, consists of three subfactors: gain of knowledge and skills, self-efficacy, and persistence. The relationship between this dependent variable of SLS and the two independent variables, TF and RF, was established based on the studies discussed in this section.

Methods

This correlational study engaged 210 doctoral students from 26 online-based leadership doctoral programs in the United States. This section describes the context, participants, instruments, and data collection and analysis in detail.

Context and Participants

This study involved doctoral students from programs that offer a PhD, EdD, or PsyD with “leadership” in their degree titles and that deliver instruction in fully or partially online environments. All leadership doctoral programs in U.S. higher education institutions were identified by drawing upon doctoral program directories compiled and shared by individual leadership scholars and organizations, as well as the websites of all 50 state higher education agencies. Website information on each program was examined to determine whether learning was delivered online; if this could not be readily determined, further investigation was done, including an examination of course catalogs and schedules. It must be recognized that, while extensive, the Web search was only as accurate as the information provided on each university’s website. The demographics of the respondents are summarized in Table 1.

Table 1

Participant Demographics (N = 210)

Category Characteristic Frequency (%)
Gender Male 70 (33)
Female 140 (67)
Age 20-29 15 (7)
30-39 70 (33)
40-49 64 (31)
50-59 53 (25)
60+ 8 (4)
Status First year 54 (26)
Midway through coursework 70 (33)
Dissertation phase 68 (32)
Dissertation completed 18 (9)
Degree PhD 36 (17)
EdD 169 (81)
PsyD 5 (2)
Discipline Education 183 (87)
Business/Management 15 (7)
Other leadership 12 (6)
Delivery 100% online 54 (26)
Blended instruction: 50% or more online 96 (46)
Primarily face-to-face classroom instruction 57 (27)
Other 3 (1)

Instruments

The Online Learning Success Scale (OLSS) was constructed by drawing upon nine existing scales listed in the References column of Table 2. The OLSS measures three major variables: technological factors, relational factors, and student learning success, each with three subfactors (see the Constructs and Subfactors columns in Table 2). Some conceptual categories and questions were modified to measure the constructs intended for the study and its doctoral leadership context. Cronbach’s alpha was used to measure the reliability of each variable of the final OLSS (see the Reliability Coefficients column in Table 2). The reliability values of the factors ranged from .936 to .949, and those of the subfactors ranged from .857 to .967.
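To make the reliability computation concrete, the following is a minimal sketch of Cronbach’s alpha in Python (pandas). This is an illustration only: the study conducted its analyses in SPSS (see Data Collection and Analyses), and the item column names shown are hypothetical.

```python
# A minimal sketch of the reliability check described above, assuming the
# survey responses sit in a pandas DataFrame with one column per item.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert items (rows = respondents)."""
    items = items.dropna()
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical usage: an 8-item student-faculty (RF_SF) subscale
# rf_sf_items = survey_df[[f"RF_SF_{i}" for i in range(1, 9)]]
# print(round(cronbach_alpha(rf_sf_items), 3))   # cf. .892 in Table 2
```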

The first independent variable, TF, consists of three subfactors: (a) usefulness, (b) flexibility, and (c) ease of use. “Usefulness” refers to the positive impact of the online delivery system on students’ learning and doctoral experience; “flexibility” to the advantages of using a technological tool to overcome time and geographic limitations; and “ease of use” to the minimal effort involved in engaging within an online platform. The second independent variable, RF, consists of three subfactors: (a) student-student relationship, (b) student-faculty relationship, and (c) student-non-teaching staff relationship. “Student-student relationship” refers to students’ connectedness with their peers and the feeling of community within their leadership program; “student-faculty” to students’ connectedness and ability to communicate with faculty; and “student-non-teaching staff relationship” to students’ connectedness with and perceived helpfulness of the non-teaching staff.

The dependent variable, SLS, consists of three subfactors, (a) gain of knowledge and skills, (b) self-efficacy, and (c) persistence. “Gain of knowledge and skills” refers to students’ perceived gain of knowledge and skills pertaining to leadership; “self-efficacy” to their ability to apply their knowledge and skills to their leadership practice and to conduct original research; and “persistence” to their commitment to finishing the program in their current institution.

Table 2

Online Learning Success Scale Information and Reliability Coefficients

Constructs Subfactors References No. of items Reliability coefficients
Technological Factors (TF) Usefulness (TF_US) Student satisfaction scale (Arbaugh, 2000) 6 .895
Flexibility (TF_FL) Student satisfaction scale (Arbaugh, 2000) 6 .887
Ease of Use (TF_EU) Student satisfaction with e-learning instrument (Bures et al., 2000) 10 .901
Relational Factors (RF) Student-Student (RF_SS) Classroom community scale (Rovai, 2002); Community of inquiry (Akyol & Garrison, 2011) 11 .967
Student-Faculty (RF_SF) Student-faculty communication questionnaire (Liu, Rau, & Schulz, 2014); Six elements of measuring relationships (Cho & Auger, 2013) 8 .892
Student-Non-Teaching Staff (RF_SN) Six elements of measuring relationships (Cho & Auger, 2013) 8 .966
Student Learning Success (SLS) Gain of Knowledge and Skills (SLS_KS) Alavi’s perceived student learning scale (Alavi, 1994; Williams, Duray, & Reddy, 2006) 6 .857
Self-Efficacy (SLS_SE) Foundation practice self-efficacy scale (Holden, Anastas, & Meenaghan, 2003) 7 .869
Persistence (SLS_PE) College persistence questionnaire (Davidson, Beck, & Milligan, 2009) 11 .901

Data Collection and Analyses

The OLSS was transferred into Qualtrics, an online survey platform, for data collection. An introduction containing the link to the survey was sent via email to a comprehensive list of 239 online-based leadership doctoral program directors in three rounds of distribution, with one reminder per round. Program directors who accepted the participation invitation sent the survey link to their students and recent graduates. Directors who did not act on our invitation either did not communicate with the researchers at all or cited various reasons for declining, such as institutional IRB rules, too many study requests, a program not beginning until the next year, lack of program participation, and lack of online components in the program.

Initially, 276 respondents participated in the survey. Two respondents did not consent to participate, and 39 indicated that they were not currently enrolled in a doctoral leadership program. Of the remaining 235 responses, the 210 that were fully completed, representing 26 programs, were included in the final analysis. Participants responded to the survey statements using a 5-point Likert scale, with a rating of 5 indicating strong agreement with the statement. Examples of survey statements include: “I can apply critical thinking skills within the context of leadership practice”; “Small group online activities improve the quality of my education in the doctoral program”; “Getting to know the other students gave me a sense of belonging in the doctoral program.”
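As an illustration of the screening and scoring implied above (consent and enrollment checks, complete cases only, composite scores from 5-point Likert items), here is a hedged sketch in Python; the file name and all column names are assumptions for illustration, not the authors’ actual Qualtrics export.

```python
# A hedged sketch of screening responses and forming composite scores.
# All column names and the file name are hypothetical.
import pandas as pd

raw = pd.read_csv("qualtrics_export.csv")

# Screen out non-consenting and non-enrolled respondents; keep complete cases
df = raw[(raw["consent"] == "Yes") & (raw["enrolled"] == "Yes")].dropna()

# Composite score for each variable = mean of its 5-point Likert items
for var, prefixes in {"TF": ("TF_US", "TF_FL", "TF_EU"),
                      "RF": ("RF_SS", "RF_SF", "RF_SN"),
                      "SLS": ("SLS_KS", "SLS_SE", "SLS_PE")}.items():
    items = [c for c in df.columns if c.startswith(prefixes)]
    df[var] = df[items].mean(axis=1)
```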

Descriptive statistics were computed for the initial analysis of the three variables: TF, RF, and SLS. The normality of the data was confirmed, and Pearson correlation coefficients, which are appropriate when data are parametric and normally distributed, were calculated to examine the relationships among these variables. Multiple linear regression analysis was then used to identify the effects of TF and RF on SLS in online-based doctoral leadership programs. The statistical analysis was conducted with the Statistical Package for the Social Sciences (SPSS) 21.0.
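Although the analysis was conducted in SPSS 21.0, the same pipeline can be sketched in Python with pandas, SciPy, and statsmodels. The Shapiro-Wilk test stands in for the unspecified normality check, and the composite columns are assumed to be prepared as in the previous sketch.

```python
# Sketch of the analysis pipeline: descriptives, a normality check,
# Pearson correlations, and multiple regression. Assumes a DataFrame "df"
# with composite columns SLS, TF, and RF (see the previous sketch).
import statsmodels.api as sm
from scipy import stats

print(df[["SLS", "TF", "RF"]].describe())   # means and SDs (cf. Table 3)

for col in ("SLS", "TF", "RF"):             # one possible normality check
    print(col, stats.shapiro(df[col]))

r, p = stats.pearsonr(df["RF"], df["SLS"])  # e.g., one pairwise correlation
print(f"RF-SLS: r = {r:.3f}, p = {p:.3f}")

X = sm.add_constant(df[["TF", "RF"]])       # SLS = b0 + b1*TF + b2*RF + e
model = sm.OLS(df["SLS"], X).fit()
print(model.summary())                      # R2, F, t, p (cf. Table 5)
```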

Results

This section reports three types of results: descriptive statistics, predictability of TF and RF on SLS, and effects of technological and relational subfactors on SLS.

Descriptive Statistics

Different subdivisions of the 210 survey responses present slightly different pictures of SLS, RF, and TF. Table 3 provides means and standard deviations of these three variables by gender, age, instruction delivery model, and student status; however, statistical significance could not be tested because of the markedly unequal sizes of the subdivisions.

Although significance could not be tested across gender, age, delivery model, and student status, the mean scores of TF differ by delivery model: 100% online (4.01), blended (3.86), and primarily face-to-face (3.42). In terms of RF, the mean score of 100% online students was 3.87, lower than the mean score of 4.21 for respondents in both blended and face-to-face programs. The similar importance of RF for respondents in blended and face-to-face programs was reinforced by text answers provided in the survey.

Table 3

Means and Standard Deviations of the Three Variables (SLS, TF, and RF)

Group Characteristic Frequency (%) SLS M (SD) TF M (SD) RF M (SD)
Gender Male 70 (33.3) 4.40 (.48) 3.77 (.79) 4.21 (.58)
Female 140 (66.7) 4.36 (.45) 3.79 (.66) 4.08 (.56)
Age 20-29 15 (7.1) 4.35 (.41) 3.64 (.70) 3.91 (.47)
30-39 70 (33.3) 4.26 (.49) 3.67 (.74) 4.00 (.56)
40-49 64 (30.5) 4.46 (.44) 3.86 (.66) 4.25 (.58)
50-59 53 (25.2) 4.40 (.45) 3.80 (.71) 4.16 (.56)
60+ 8 (3.8) 4.54 (.38) 4.26 (.49) 4.35 (.43)
Delivery model 100% online 54 (25.7) 4.28 (.48) 4.01 (.63) 3.87 (.59)
Blended 96 (45.7) 4.39 (.45) 3.86 (.62) 4.21 (.54)
Primarily face-to-face 57 (27.1) 4.43 (.45) 3.42 (.77) 4.21 (.54)
Other 3 (1.4) 4.48 (.34) 3.83 (.64) 4.23 (.27)
Status in coursework First year 54 (25.7) 4.24 (.48) 3.69 (.70) 4.14 (.55)
Mid-coursework 70 (33.3) 4.38 (.43) 3.62 (.73) 4.11 (.54)
Dissertation phase 68 (32.4) 4.44 (.48) 3.92 (.66) 4.06 (.62)
Dissertation completed 18 (8.6) 4.51 (.34) 4.17 (.52) 4.17 (.48)

Note. SLS (student learning success), TF (technological factor), RF (relational factor).

To investigate the relationships among TF, RF, and SLS, Pearson’s correlation coefficients were calculated. Table 4 presents the correlation coefficients for the subvariables of the three main variables (TF, RF, and SLS). All subfactors of SLS were significantly and positively correlated with all subfactors of both TF and RF. Among the independent variables, however, two correlations were not statistically significant: between the flexibility of TF and the student-student relationship of RF (r = .052), and between the flexibility of TF and the student-faculty relationship of RF (r = .13). Moreover, stronger correlations were found between RF and SLS than between TF and SLS. The strongest positive correlation was between the persistence of SLS and the student-faculty relationship of RF (r = .777, p < .01).

Table 4

Correlations of the Three Variables (SLS, TF, and RF)

Variable 1 2 3 4 5 6 7 8 9
1. SLS_KS 1
2. SLS_SE .708** 1
3. SLS_PE .641** .573** 1
4. TF_US .297** .341** .367** 1
5. TF_FL .215** .237* .168* .619** 1
6. TF_EU .282** .321** .259** .784** .730** 1
7. RF_SS .413** .355** .566** .195** .052 .221** 1
8. RF_SF .438** .391** .777** .295** .13 .199** .478** 1
9. RF_SN .319** .269** .470** .339** .168* .280** .273** .451** 1
M 4.36 4.37 4.39 3.68 3.97 3.69 4.23 4.11 4.02
SD .53 .48 .56 .78 .84 .72 .74 .67 .78

Note. SLS (student learning success), TF (technological factors), RF (relational factors), KS (gain of knowledge and skills), SE (self-efficacy), PE (persistence), US (usefulness), FL (flexibility), EU (ease of use), SS (student-student), SF (student-faculty), SN (student-non-teaching staff).
* p < .05, ** p < .01

Significant correlations among variables do not mean that the variables have causal relationships; regression analysis is therefore necessary to examine the predictive relationships among them.

Predictability of Technological and Relational Factors on Student Learning Success

This section presents the results of Research Question 1: How do technological factors and relational factors separately and interactively predict doctoral student learning success in online-based leadership programs? Multiple regression analysis was conducted to determine if technological and relational factors affected student learning success significantly in terms of gain of knowledge and skills, self-efficacy, and persistence.

According to the results of the multiple regression analysis, TF and RF together significantly predicted SLS (R2 = .465, F = 89.903, p = .000). TF and RF each also affected SLS significantly, with RF (t = 11.382, p = .000) being a stronger predictor than TF (t = 3.209, p = .002). A variance inflation factor (VIF) of 10 or more is taken to indicate multicollinearity (Kutner, Nachtsheim, & Neter, 2004); by this criterion, there is no multicollinearity between TF and RF (VIF < 10; see Table 5).

Table 5

Effects of Technological and Relational Factors on Student Learning Success

Independent variables B Std. error β t p VIF
(Constant) 1.924 .187 10.271 .000
Technological factors (TF) .112 .035 .172 3.209** .002 1.106
Relational factors (RF) .491 .043 .609 11.382*** .000 1.106
F 89.903***
R2 (adj. R2) .465 (.460)

** p < .01, *** p < .001
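The multicollinearity check reported in Table 5 can be illustrated with statsmodels’ variance_inflation_factor; below is a minimal sketch, assuming the same hypothetical DataFrame as in the earlier sketches. The threshold of 10 follows Kutner, Nachtsheim, and Neter (2004) as cited above.

```python
# A minimal sketch of the VIF check in Table 5, assuming a DataFrame "df"
# with composite TF and RF columns (see the earlier sketches).
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

X = sm.add_constant(df[["TF", "RF"]])
vifs = {col: variance_inflation_factor(X.values, i)
        for i, col in enumerate(X.columns) if col != "const"}
print(vifs)   # values below 10 indicate no problematic multicollinearity
```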

Effects of Technological and Relational Subfactors on Student Learning Success

This section reports on the results in response to Research Question 2: Which subfactors of the technological and relational factors are the best predictors of doctoral student learning success in online-based leadership programs? To identify which subfactors of the technological and relational factors were the best predictors of student learning success, another multiple regression analysis was performed.

According to the results of the multiple regression analysis, the subfactors of TF and RF together significantly predicted SLS (R2 = .500, F = 33.867, p = .000). Individually, however, the technological subfactors (usefulness, flexibility, and ease of use) and the student-non-teaching staff relational subfactor did not significantly predict SLS. Only two relational subfactors, the student-student relationship (t = 4.436, p = .000) and the student-faculty relationship (t = 6.591, p = .000), had statistically significant effects on SLS, with the student-faculty relationship being the strongest predictor (β = .409). There is no multicollinearity among the subfactors of TF and RF (VIF < 10; see Table 6).

Table 6

Effects of Subfactors of Both Technological and Relational Factors on Student Learning Success

Independent variables B Std. error β t p VIF
(Constant) 1.871 .188 9.972 .000
TF: Usefulness (TF_US) .066 .048 .114 1.373 .171 2.791
TF: Flexibility (TF_FL) .024 .040 .044 .602 .548 2.198
TF: Ease of Use (TF_EU) .034 .060 .053 .566 .572 3.611
RF: Student-Student (RF_SS) .160 .036 .258 4.436*** .000 1.376
RF: Student-Faculty (RF_SF) .278 .042 .409 6.591*** .000 1.562
RF: Student-Non-Teaching Staff (RF_SN) .054 .034 .093 1.607 .110 1.356
F 33.867***
R2 (adj. R2) .500 (.485)

*** p < .001
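To show how the standardized (beta) coefficients in Table 6 single out the strongest predictor, here is a sketch of the subfactor-level regression in which all variables are z-scored, so the fitted coefficients are beta weights. The subfactor column names mirror the abbreviations in Table 4 but are assumptions about the data layout.

```python
# Sketch: subfactor-level regression with standardized (beta) coefficients.
# Assumes a DataFrame "df" holding the six subfactor composites plus SLS.
import statsmodels.api as sm

subfactors = ["TF_US", "TF_FL", "TF_EU", "RF_SS", "RF_SF", "RF_SN"]
z = df[subfactors + ["SLS"]].apply(lambda s: (s - s.mean()) / s.std(ddof=1))

fit = sm.OLS(z["SLS"], sm.add_constant(z[subfactors])).fit()
# The largest significant coefficient identifies the strongest predictor;
# per Table 6 this is expected to be RF_SF (beta = .409).
print(fit.params.drop("const").sort_values(ascending=False))
```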

Discussion and Conclusion

Faced with the increasing importance of distance learning as a preferred means of obtaining a degree at the graduate level, including the doctoral level (National Center for Education Statistics, 2018), higher education institutions and programs must consider the impact of technology and relationships, individually and interactively, within the online environment. The intent of this study was to determine how TF and RF relate to the SLS of students engaged in U.S. doctoral leadership programs.

The analysis of the data collected in this study found significant correlations, confirming the importance of both technology and human relationships in the learning success of doctoral students in online-based learning environments. Persistence, students’ determination to continue to completion, was most strongly related to RF for respondents from blended or 100% online programs. This result corresponds with similar studies that have established connectedness and social integration as critical to the likelihood of doctoral students persisting through the coursework and candidacy stages of a program (Kennedy et al., 2015; Martínez-Argüelles & Batalla-Busquets, 2016; Rockinson-Szapkiw et al., 2016).

Another finding indicates that the three RF subfactors (student-student, student-faculty, and student-non-teaching staff) were, both collectively and separately, better predictors of doctoral SLS than the TF subfactors, whether success is defined as gain of knowledge and skills, self-efficacy, or persistence. These results concur with similar studies (Kennedy et al., 2015; Lambie et al., 2014; Rockinson-Szapkiw et al., 2016). Interviews with doctoral students at a research-intensive university in New Zealand found that technology was an effective means of facilitating the development of learning communities in which students construct meaningful knowledge and share individual experiences (Lai, 2015).

Technology is important, but it appears to be a means to the end of student learning, secondary to relationships. Our study found that the student-faculty relationship was the subfactor with the strongest predictive power for SLS. The instructor is the pivotal participant in the online learning experience, facilitating productive dialogue, encouraging the exploration of new concepts, and providing timely feedback (Augustsson & Jaldemark, 2013; Kumar, 2014). An integrated literature review by Hart (2012) identified connectedness, belonging, and support as factors that went beyond the content itself in motivating students to overcome hardships and persist in the online-based environment. A grounded theory study of students in a limited-residency program found that the greatest factor in not completing doctoral work, especially in the dissertation phase, was a lack of supportive interaction (Kennedy et al., 2015).

This is not to negate the correlation of TF with student success. Of the three subfactors of SLS, self-efficacy correlated most strongly with the TF of blended or online learning. One explanation is that this study surveyed doctoral students who had already experienced academic success. A meta-analysis of within-person self-efficacy found that self-efficacy was a product of past performance rather than a predictor of future performance (Sitzmann & Yeo, 2013). The self-efficacy of doctoral students thus increases as courses are completed and aligned with research opportunities (Lambie et al., 2014). The very definition of self-efficacy involves the ability of individuals to identify the contexts in which they have the skills and ability to succeed (Celik & Yesilyurt, 2013). Doctoral leadership programs should be aware of this integration of knowledge and skills when creating technology-based learning opportunities associated with increased self-efficacy.

In summary, the results of this study lead to the conclusion that both TF and RF predict learning success as perceived by students enrolled in online-based doctoral leadership programs in the United States, and that RF, particularly the student-faculty relationship, predict SLS better than TF. Distance education programs must purposefully develop support systems, such as the cohort model, that encourage connectedness and social integration (Kennedy et al., 2015; Williams et al., 2019). Administrators, faculty, and staff of distance education programs must be prepared to facilitate communication using technology and understand the importance of timely responses to students at all phases of the doctoral program (Gardner, 2009; Rockinson-Szapkiw et al., 2016).

This study has several limitations that might have affected the findings. Regarding program and participant selection, the study’s data were limited by several uncontrollable conditions. Information on individual institutions’ websites was often incomplete or outdated, which made it difficult to accurately determine the online nature of the programs; this difficulty was compounded by the wide spectrum of terms used to describe online programs (Anohina, 2005). In addition, a good number of eligible programs or participants were inaccessible because of institutional or programmatic constraints and the unresponsiveness of directors or student participants.

While this study found no differences by gender, status, or age, there was a gender imbalance, with two-thirds of the respondents being female. The literature reports mixed results with regard to gender and relational preference. A study of 12 online-based graduate courses found that female students felt more connected with their peers and perceived that they learned more than their male counterparts, while a study of students in Taiwan found that differences were related to status in the college program (Hung, Chou, Chen, & Own, 2010; Rovai & Baker, 2005). Other studies, like this one, found no differences in the success or satisfaction of students by gender, status, or age (Cho & Kim, 2013; Martin, 2005). Lastly, this study engaged only programs based in the United States, which limits the generalizability of the findings. However, similar studies in different contexts have also concluded that relationships are the critical factor in the success of students in online-based educational environments (Fuller et al., 2014; Lai, 2015; Roach & Lemasters, 2006; Sohail & Shaikh, 2004).

Based on these limitations, further study is recommended. First, future research should engage a more balanced set of participants by gender, age, and degree type. Second, it could expand beyond the leadership discipline or the U.S. context to compare disciplines and contexts. Third, qualitative studies of online doctoral leadership programs could provide a holistic understanding of programs and doctoral SLS by gaining multiple perspectives from program directors, faculty, students, and alumni beyond pre-selected variables such as TF and RF.

References

Akyol, Z., & Garrison, D. R. (2011). Understanding cognitive presence in an online and blended community of inquiry: Assessing outcomes and processes for deep approaches to learning. British Journal of Educational Technology, 42(2), 233-250. doi: 10.1111/j.1467-8535.2009.01029.x

Alammary, A., Sheard, J., & Carbone, A. (2014). Blended learning in higher education: Three different design approaches. Australasian Journal of Educational Technology, 30(4), 440-454. doi: 10.14742/ajet.693

Alavi, M. (1994). Computer-mediated collaborative learning: An empirical evaluation. MIS Quarterly, 18, 159-174.

Allen, I. E., Seaman, J., Poulin, R., & Straut, T. T. (2016). Online report card: Tracking online education in the United States. Newburyport, MA: Online Learning Consortium. Retrieved from https://files.eric.ed.gov/fulltext/ED572777.pdf

Ampaw, F. D., & Jaeger, A. J. (2012). Completing the three stages of doctoral education: An event history analysis. Research in Higher Education, 53(6), 640-660. Retrieved from http://www.jstor.org/stable/23257602

Anohina, A. (2005). Analysis of the terminology used in the field of virtual learning. Educational Technology & Society, 8(3), 91-102. Retrieved from https://www.learntechlib.org/p/74943/

Arbaugh, J. B. (2000). Virtual classroom characteristics and student satisfaction with internet-based MBA courses. Journal of Management Education, 24(1), 32-54. doi: 10.1177/105256290002400104

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a measure of the community of inquiry framework using a multi-institutional sample. The Internet and Higher Education, 11(3-4), 133-136. doi: 10.1016/j.iheduc.2008.06.003

Augustsson, G., & Jaldemark, J. (2013). Online supervision: A theory of supervisors’ strategic communicative influence on student dissertations. Higher Education, 67(1), 19-33. doi: 10.1007/s10734-013-9638-4

Bandura, A. (1997). Self-efficacy: The exercise of control. New York, NY: Freeman and Company.

Bayne, S., Gallagher, M. S., & Lamb, J. (2014). Being “at” university: The social topologies of distance students. Higher Education, 67(5), 567-583. Retrieved from http://sianbayne.net/wp-content/uploads/2013/08/Being-at-University-draft.pdf

Bolliger, D. U., & Halupa, C. (2012). Student perceptions of satisfaction and anxiety in an online doctoral program. Distance Education, 33(1), 81-98. Retrieved from https://www.learntechlib.org/p/111288/

Bures, E. M., Abrami, P. C., & Amundsen, C. (2000). Student motivation to learn via computer conferencing. Research in Higher Education, 41(5), 593-621. doi: 10.1023/A:1007071415363

Celik, V., & Yesilyurt, E. (2013). Attitudes to technology, perceived computer self-efficacy and computer anxiety as predictors of computer supported education. Computers & Education, 60(1), 148-158. doi: 10.1016/j.compedu.2012.06.008

Cheung, R., & Vogel, D. (2013). Predicting user acceptance of collaborative technologies: An extension of the technology acceptance model for e-learning. Computers & Education, 63, 160-175. doi: 10.1016/j.compedu.2012.12.003

Cho, M., & Auger, G. A. (2013). Exploring determinants of relationship quality between students and their academic department perceived relationship investment, student empowerment, and student–faculty interaction. Journalism & Mass Communication Educator, 68(3), 255-268. doi: 10.1177/1077695813495048

Cho, M. H., & Kim, B. J. (2013). Students’ self-regulation for interaction with others in online learning environments. The Internet and Higher Education, 17, 69-75. doi: 10.1016/j.iheduc.2012.11.001

Davidson, W. B., Beck, H. P., & Milligan, M. (2009). The college persistence questionnaire: Development and validation of an instrument that predicts student attrition. Journal of College Student Development, 50(4), 373-390. doi: 10.1353/csd.0.0079

Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319-340. doi: 10.2307/249008

Edmunds, R., Thorpe, M., & Conole, G. (2012). Student attitudes towards and use of ICT in course study, work and social activity: A technology acceptance model approach. British Journal of Educational Technology, 43(1), 71-84. doi: 10.1111/j.1467-8535.2010.01142.x

Erichsen, E. A., Bolliger, D. U., & Halupa, C. (2014). Student satisfaction with graduate supervision in doctoral programs primarily delivered in distance education settings. Studies in Higher Education, 39(2), 321-338. doi: 10.1080/03075079.2012.709496

Fuller, J. S., Risner, M. E., Lowder, L., Hart, M., & Bachenheimer, B. (2014). Graduates’ reflections on an online doctorate in educational technology. TechTrends, 58(4), 73-80. doi: 10.1007/s11528-014-0771-4

Gardner, S. K. (2009). Conceptualizing success in doctoral education: Perspectives of faculty in seven disciplines. The Review of Higher Education, 32(3), 383-406. doi: 10.1353/rhe.0.0075

Garrison, D. R. (2007). Online community of inquiry review: Social, cognitive, and teaching presence issues. Journal of Asynchronous Learning Networks, 11(1), 61-72.

Hart, C. (2012). Factors associated with student persistence in an online program of study: A review of the literature. Journal of Interactive Online Learning, 11(1), 19-42. Retrieved from https://files.eric.ed.gov/fulltext/EJ842688.pdf

Hill, P. (2012). Online educational delivery models: A descriptive view. Educause Review, 47(6), 85-97. Retrieved from https://er.educause.edu/articles/2012/11/online-educational-delivery-models--a-descriptive-view

Holden, G., Anastas, J., & Meenaghan, T. (2003). Determining attainment of the EPAS foundation program objectives: Evidence for the use of self-efficacy as an outcome. Journal of Social Work Education, 39(3), 425-440. doi: 10.1080/10437797.2003.10779147

Hung, M. L., Chou, C., Chen, C. H., & Own, Z. Y. (2010). Learner readiness for online learning: Scale development and student perceptions. Computers & Education, 55(3), 1080-1090. doi: 10.1016/j.compedu.2010.05.004

Im, T., & Kang, M. (2019). Structural relationships of factors which impact on learner achievement in online learning environment. International Review of Research in Open and Distributed Learning, 20(1), 111-124. doi: 10.19173/irrodl.v20i1.4012

Ivankova, N. V., & Stick, S. L. (2007). Students’ persistence in a distributed doctoral program in educational leadership in higher education: A mixed methods study. Research in Higher Education, 48(1), 93-135. doi: 10.1007/s11162-006-9025-4

Joo, Y. J., Lim, K. Y., & Kim, E. K. (2011). Online university students’ satisfaction and persistence: Examining perceived level of presence, usefulness and ease of use as predictors in a structural model. Computers & Education, 57(2), 1654-1664. doi: 10.1016/j.compedu.2011.02.008

Kennedy, D. H., Terrell, S. R., & Lohle, M. (2015). A grounded theory of persistence in a limited-residency doctoral program. The Qualitative Report, 20(3), 215-230. Retrieved from https://nsuworks.nova.edu/tqr/vol20/iss3/5

Kuh, G. D., Kinzie, J., Schuh, J. H., & Whitt, E. J. (2010). Student success in college: Creating conditions that matter. San Francisco, CA: Jossey-Bass.

Kumar, S. (2014). Signature pedagogy, implementation and evaluation of an online program that impacts educational practice. The Internet and Higher Education, 21, 60-67. doi: 10.1016/j.iheduc.2013.11.001

Kutner, M. H., Nachtsheim, C. J., & Neter, J. (2004). Applied linear regression models (4th ed.). Boston, MA: McGraw-Hill/Irwin.

Lai, K. W. (2015). Knowledge construction in online learning communities: A case study of a doctoral course. Studies in Higher Education, 40(4), 561-579. doi: 10.1080/03075079.2013.831402

Lambie, G. W., Hayes, B. G., Griffith, C., Limberg, D., & Mullen, P. R. (2014). An exploratory investigation of the research self-efficacy, interest in research, and research knowledge of Ph.D. in education students. Innovative Higher Education, 39(2), 139-153. doi: 10.1007/s10755-013-9264-1

Lee, P. C., & Mao, Z. (2016). The relation among self-efficacy, learning approaches, and academic performance: An exploratory study. Journal of Teaching in Travel & Tourism, 16(3), 178-194. doi: 10.1080/15313220.2015.1136581

Lee, S. M. (2014). The relationships between higher order thinking skills, cognitive density, and social presence in online learning. The Internet and Higher Education, 21, 41-52. doi: 10.28945/3418

Liu, J., Rau, P. L. P., & Schulz, B. (2014). Culture and student-faculty communication in higher education: Implications for the design of educational communication tools. In P. L. P. Rau (Ed.), Cross-cultural design (pp. 563-573). New York, NY: Springer International Publishing.

Martin, K. (2005). Self-efficacy as an evaluation measure for programs in support of online learning literacies for undergraduates. The Internet and Higher Education, 8(4), 307-322. doi: 10.1016/j.iheduc.2005.09.004

Martínez-Argüelles, M. J., & Batalla-Busquets, J. M. (2016). Perceived service quality and student loyalty in an online university. The International Review of Research in Open and Distributed Learning, 17(4), 264-279. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/2518/3788

National Center for Education Statistics. (2018). Integrated postsecondary education data system (IPEDS): Table 311.15 [Data file]. Retrieved from https://nces.ed.gov/programs/digest/d17/tables/dt17_311.15.asp

Ravindran, S. D., & Kalpana, M. (2012). Student’s expectation, perception and satisfaction towards the management educational institutions. Procedia Economics and Finance, 2, 401-410. doi: 10.1016/S2212-5671(12)00102-5

Roach, V., & Lemasters, L. (2006). Satisfaction with online learning: A comparative descriptive study. Journal of Interactive Online Learning, 5(3), 317-332. Retrieved from http://www.ncolr.org/jiol/issues/pdf/5.3.7.pdf

Rockinson-Szapkiw, A. J., Wendt, J., Whighting, M., & Nisbet, D. (2016). The predictive relationship among the community of inquiry framework, perceived learning and online, and graduate students’ course grades in online synchronous and asynchronous courses. The International Review of Research in Open and Distributed Learning, 17(3), 18-35. doi: 10.19173/irrodl.v17i3.2203

Rovai, A. P. (2002). Development of an instrument to measure classroom community. The Internet and Higher Education, 5(3), 197-211. doi: 10.1016/S1096-7516(02)00102-1

Rovai, A. P., & Baker, J. D. (2005). Gender differences in online learning: Sense of community, perceived learning, and interpersonal interactions. Quarterly Review of Distance Education, 6(1), 31-44. Retrieved from https://www.learntechlib.org/p/106724/

Sampson, P. M., Leonard, J., Ballenger, J. W., & Coleman, J. C. (2010). Student satisfaction of online courses for educational leadership. Online Journal of Distance Learning Administration, 13(3). Retrieved from https://www.westga.edu/~distance/ojdla/Fall133/sampson_ballenger133.html

Shea, P., & Bidjerano, T. (2009). Community of inquiry as a theoretical framework to foster “epistemic engagement” and “cognitive presence” in online education. Computers & Education, 52(3), 543-553. doi: 10.1016/j.compedu.2008.10.007

Sitzmann, T., & Yeo, G. (2013). A meta-analytic investigation of the within-person self-efficacy domain: Is self-efficacy a product of past performance or a driver of future performance? Personnel Psychology, 66(3), 531-568. doi: 10.1111/peps.12035

Sohail, M. S., & Shaikh, N. M. (2004). Quest for excellence in business education: A study of student impressions of service quality. International Journal of Educational Management, 18(1), 58-65. doi: 10.1108/09513540410512163

Sowell, R., Zhang, T., Redd, K., & King, M. (2008). Ph.D. completion and attrition: Analysis of baseline program data from the Ph.D. Completion Project. Washington, DC: Council of Graduate Schools. Retrieved from https://cgsnet.org/phd-completion-and-attrition-analysis-baseline-program-data-phd-completion-project

Tinto, V. (1999). Taking retention seriously: Rethinking the first year of college. NACADA Journal, 19(2), 5-9. doi: 10.12930/0271-9517-19.2.5

Williams, E. A., Duray, R., & Reddy, V. (2006). Teamwork orientation, group cohesiveness, and student learning: A study of the use of teams in online distance education. Journal of Management Education, 30(4), 592-616. doi: 10.1177/1052562905276740

Williams, P. E., Wall, N., & Fish, W. (2019). Mid-career adult learners in an online doctoral program and the drivers of their academic self-regulation: The importance of social support and parent education level. International Review of Research in Open and Distributed Learning, 20(1), 63-78. doi: 10.19173/irrodl.v20i1.3789

York, T., Gibson, C., & Rankin, S. (2015). Defining and measuring academic success. Practical Assessment, Research & Evaluation, 20(5), 1-20. Retrieved from https://pareonline.net/getvn.asp?v=20&n=5

 


Doctoral Students’ Learning Success in Online-Based Leadership Programs: Intersection with Technological and Relational Factors by HyunKyung Lee, Heewon Chang, and Lynette Bryan is licensed under a Creative Commons Attribution 4.0 International License.