Paul Gorsky, Avner Caspi, Avishai Antonovsky, Ina Blau, and Asmahan Mansur
Open University of Israel
The objective of this study was to determine the relationship between disciplinary difference (exact and natural sciences versus humanities) and the dialogic behavior that occurred in Open University course forums. Dialogic behavior was measured in terms of students’ and instructors’ active participation in the forum (posting a message) as well as amounts and proportions of “teaching presence,” “cognitive presence,” and “social presence.” We found that active participation in the science forums was much higher than in the humanities forums. We also found a ratio among the three presences that was constant across different academic disciplines, as well as across different group sizes and course types.
Keywords: Academic disciplines; disciplinary differences; asynchronous forums; dialogic behavior; community of inquiry model; virtual learning community
The organization of knowledge into academic disciplines and the impact of these disciplines on educational objectives and curricula, on how subject matter is taught and learned, on how academic achievement is evaluated, and on how research is carried out have been extensively reported. The goal of our research is to investigate the impact of academic discipline on the dialogic behavior of participants in Open University course forums, that is, students’ and instructors’ active participation in the forum (i.e., posting a message) as well as amounts and proportions of “teaching presence,” “cognitive presence,” and “social presence.” In order to study this relationship, we compared course forums from two broad disciplines whose differences greatly outweighed their similarities: exact and natural sciences versus humanities.
To place this study in a meaningful framework, we first discuss the nature of academic disciplines with special emphasis on the classic model proposed by Biglan (1973), who classified characteristics of subject matter in different academic areas. Second, we review research findings about educational objectives and how subject-matter is typically taught and learned in different disciplines. Finally, we present the research methodology and findings.
Discipline is defined by the Oxford English Dictionary as “a branch of learning or scholarly instruction.” Given the diversity of academic disciplines that account for much of human knowledge, it is not surprising that significant similarities and differences exist among them. We now present a classic framework for classifying disciplines.
Biglan’s research (1973) into the similarities and differences between subject matter across diverse disciplines (and the personal characteristics of those researchers who engaged in these disciplines) is considered a classic; since its publication, it has been cited extensively. Biglan found three dimensions along which disciplines may be classified: hard versus soft (the degree of paradigm development), pure versus applied (the degree of concern with practical application), and life versus non-life systems.
Despite the statistical significance associated with the life : non-life dimension, it has generally been ignored. Left with a 2 × 2 matrix, Biglan (1973) classified disciplines into four categories; to each category, he assigned representative disciplines and described the nature of their subject matter. Table 1 summarizes his classification.
Biglan (1973), and others who used this classification, clearly acknowledged that some disciplines (or particular sub-disciplines within a discipline) may straddle boundaries. For example, some fields within philosophy, such as logic, tend toward hard-pure, while others, such as epistemology, tend toward soft-pure. Furthermore, disciplines or sub-disciplines may, over time, migrate from one grouping to another; for example, linguistics has moved toward the hard-pure area through the increased influence of computational research. Although these groupings were made decades ago, they still serve today as useful models for carrying out empirical research.
To date, research has shown that disciplinary differences have a significant influence on the ways in which academic work is organized (Becher, 1990; 1994; Becher & Trowler, 2001; Neumann, 2001). The formal academic goals of undergraduate programs commonly take the form either of a brief description of the subject matter or of a claim to high intellectual benefits. For example, a typical literature course (humanities, soft-pure) may cite as its goal: “to introduce students to the main tenets of literary criticism” or “students will appreciate the relevance of Shakespearean drama to the modern world.” A typical mathematics course (exact science, hard-pure) may cite as its goal: “to introduce students to the fundamentals of calculus” or “students will acquire tools for analyzing partial differential equations.” These formal goals are seemingly very similar; however, when operationalized, they differ in very significant ways from one knowledge field to another.
The subject-matter characteristics of hard and soft disciplines described by Biglan (1973) in Table 1 generally correspond with particular instructional strategies. On the one hand, according to Biglan, hard subjects, both pure and applied, are generally grounded in an epistemological stance that is objective and absolute. Hard subjects are generally quantitative, based on precise measurements and widely accepted theories. Problem solving and practical skills are given high importance and priority. Hard subjects generally place greater emphasis on mastery of content than on discussion. Teaching, therefore, is often didactic, based on lectures and workbooks.
On the other hand, according to Biglan (1973), soft subjects, both pure and applied, are generally grounded in an epistemological stance that is subjective and relative. They are generally qualitative and tend to place less emphasis on hierarchical knowledge foundations expressed mathematically. Discussion is a frequently employed instructional strategy.
Garrison, Anderson, and Archer (2000) developed the community of inquiry (CoI) model as an online learning research tool. The CoI model provides a comprehensive theoretical framework for research into both online learning and the practice of online instruction (Arbaugh, Bangert, & Cleveland-Innes, 2010). The model emerged in the specific context of computer conferencing in higher education, that is, asynchronous, text-based group discussions (Garrison, Anderson, & Archer, 2010). It addressed a lack of theoretical development in the field of online education and triggered a large number of empirical studies (Akyol, Garrison, & Ozden, 2009). During the years 2000-2008, 48 studies were carried out using the CoI model (Rourke & Kanuka, 2009), and the body of research continues to grow rapidly, suggesting important implications for the design of successful e-learning (Garrison, Anderson, & Archer, 2010; Shea & Bidjerano, 2009a).
The framework consists of three dimensions: cognitive presence, teaching presence, and social presence, as well as categories and indicators to define each of the presences and to guide the coding of transcripts. Cognitive presence is defined by Garrison, Anderson, and Archer (2001) as the extent to which participants are able to construct meaning through sustained communication. Teaching presence includes subject matter expertise, the design and management of learning, and the facilitation of active learning (Anderson, Rourke, Garrison, & Archer, 2001). Social presence is the perceived presence of others in mediated communication (Rourke, Garrison, Anderson, & Archer, 1999), which Garrison et al. (2000) contend supports both cognitive and teaching presence through its ability to instigate, to sustain, and to support interaction. The framework, which had its genesis in the work of John Dewey, has provided significant insights and methodological solutions for studying online learning (Akyol et al., 2009; Garrison, Anderson, & Archer, 2010; Garrison & Archer, 2003; Garrison, Cleveland-Innes, Koole, & Kappelman, 2006). The structure of the community of inquiry model has been confirmed through factor analysis (Arbaugh, 2008; Arbaugh & Hwang, 2006; Garrison & Arbaugh, 2007; Garrison, Cleveland-Innes, & Fung, 2010; Shea & Bidjerano, 2009b; Swan et al., 2008).
Social presence is described as the ability to project one’s self and to establish personal and purposeful relationships (Rourke et al., 1999). The three main categories of social presence are affective communication, open communication, and group cohesion. Richardson and Swan (2003) explored perceptions of social presence in online courses and found that students’ perceptions of social presence were highly correlated with perceived learning and satisfaction with their instructors (see also Steinweg, Trujillo, Jeffs, & Hopfengardner-Warren, 2006). Picciano (2002) found relationships between student perceptions of social presence, learning, and interactions in the course discussions. The positive correlation between perceived social presence, seen according to the community of inquiry model as self-projection, and most aspects of perceived learning may lead to the conclusion that social presence affords learning by establishing a supportive climate (Caspi & Blau, 2008). Garrison, Cleveland-Innes, and Fung (2010) argued recently that perceived social presence can be seen as a mediating variable between perceived teaching presence and cognitive presence. However, actual interaction in the course discussions in Picciano's (2002) study was not correlated with actual performance (students' scores on a multiple-choice exam and on a written assignment). Whether and how actual social interaction might or might not affect actual learning online remains unclear (Caspi & Blau, 2008; Swan & Shea, 2005).
Several studies investigated the shift of social presence over time in online course discussions. Swan (2002) reported that open communication indicators (“affective” and “interactive”) of social presence increased over time, while cohesive indicators decreased. One possible explanation is that the use of such references became less necessary as a galvanized classroom community was formed. Another possible explanation is that the discussions were more exploratory than collaborative. Contrary to the nature of the shift in social presence reported by Swan (2002), Vaughan (2004) and Vaughan and Garrison (2006) found that the frequency of affective and open communication comments decreased, while group cohesion comments increased. It is important to note that the context of Vaughan’s study (2004) was a blended professional development community. The interpretation was that affective and open communication was necessary to establish a sense of community. It was only after the social relationships were established and the group became more focused on purposeful activities that cohesive comments began to take precedence. Social presence online becomes somewhat transparent as the focus shifts to academic purposes and activities.
Teaching presence is defined as “the design, facilitation and direction of cognitive and social processes for the purpose of realizing [students’] personally meaningful and educationally worthwhile outcomes” (Anderson et al., 2001, p. 5). Vygotsky’s (1978) scaffolding analogies illustrate an assistive role for teachers in providing instructional support to students from their position of greater content knowledge. Although many authors recommend a “guide on the side” approach to moderating student discussions, a key feature of this social-cognition model is the adult, the expert, or the more skilled peer who scaffolds a novice’s learning (Anderson et al., 2001). The community of inquiry model defines three categories of teaching presence: design and organization, facilitating discourse, and direct instruction. The categories of teaching presence were tested by Anderson et al. (2001) in the analysis of the complete transcripts of two online courses and proved both reasonably reliable and useful in identifying differences in both the quantity and quality of the teaching presence projected by different online instructors. How these differences might relate to community has not yet been hypothesized, but the community of inquiry model might provide a starting point for such investigations (Swan & Shea, 2005).
The body of evidence attesting to the importance of teaching presence for successful online learning is growing rapidly (Bliss & Lawrence, 2009; Garrison & Cleveland-Innes, 2005; Garrison, Cleveland-Innes, & Fung, 2010; Meyer, 2003; Murphy, 2004; Pawan, Paulus, Yalcin, & Chang, 2003; Shea, Pickett, & Pelz, 2004; Swan, 2002; Swan & Shih, 2005; Varnhagen, Wilson, Krupa, Kasprzak, & Hunting, 2005; Vaughan, 2004; Wu & Hiltz, 2004). The consensus is that teaching presence is a significant determinant of perceived learning, student satisfaction, and sense of community. Perceived teaching presence had a strong direct effect on self-reported learning outcomes (LaPointe & Gunawardena, 2004). Each category of a tutor’s presence is vital to learning and the establishment of the learning community; tutors' behavior must be such that they are seen to be “posting regularly, responding in a timely manner and modeling good online communication and interaction” (Palloff & Pratt, 2003, p. 118). Without an instructor’s explicit guidance and “teaching presence,” students were found to engage primarily in “serial monologues” (Pawan et al., 2003). Baker (2004) discovered that instructor immediacy, i.e., teaching presence (Rourke et al., 1999), was a more reliable predictor of effective cognitive learning than whether students felt “close to each other,” i.e., social presence.
Studies have demonstrated that instructor participation in threaded discussion is critical to the development of social presence (Shea, Li, Swan, & Pickett, 2005; Swan & Shih, 2005) and is sometimes not fully appreciated by online faculty (Liu, Bonk, Magjuka, Lee, & Su, 2005). Shea, Li, and Pickett (2006) proposed that teaching presence – viewed as the core role of the online instructor – is a promising mechanism for developing learning community in online environments. The majority of students and instructors in Vesely, Bloom, and Sherlock’s (2007) study identified the same elements for building online community, but students ranked instructor modeling as the most important element in building online community, while instructors ranked it fourth.
Cognitive presence is defined as the exploration, construction, resolution, and confirmation of understanding through collaboration and reflection in a community of inquiry (Garrison et al., 2001). Cognitive presence is grounded in the work of Dewey (1933) on reflective thinking (see Swan, Garrison, & Richardson, 2009, for further discussion). Four categories (or phases) of cognitive presence are defined: triggering event, exploration, integration, and resolution. Garrison et al. (2001) argued that the third phase, integration, is the most difficult to detect from a teaching or research perspective. This phase requires active teaching presence to diagnose misconceptions, to provide probing questions, comments, and additional information in an effort to ensure continuing cognitive development, and to model the critical thinking process. Often students will be more comfortable remaining in a continuous exploration mode; therefore, teaching presence is essential in moving the process to more advanced stages of critical thinking and cognitive development.
Recently, Garrison, Cleveland-Innes, and Fung (2010) suggested that the dynamic relationships among the presences across different academic disciplines be explored. This investigation does just that in the context of differences between academic disciplines. Arbaugh, Bangert, and Cleveland-Innes (2010) also studied disciplinary differences; using a survey based on participants' perceptions of the CoI framework, they found significant differences in perceptions of social, cognitive, and teaching presence between applied and pure academic disciplines. In the present study, we used the quantitative content analysis technique and data logs to analyze three-week segments from 50 forums, half from the exact sciences and half from the humanities. Given the reliability and validity of this procedure, and given that all other relevant variables in the learning environment (course policy, content and difficulty, equivalent numbers of instructor assignments, group size, semi-random assignment to groups) were controlled, we expected to identify the impact of disciplinary difference on the dialogic behavior of the representative forums. We hypothesized that for forums in the exact sciences, active participation and levels of social presence, teaching presence, and cognitive presence would be significantly higher than for forums in the humanities. These hypotheses are based on empirical findings reported by Gorsky and Caspi (Caspi, Gorsky, & Chajut, 2003; Gorsky, Caspi, & Tuvi-Arad, 2004; Gorsky, Caspi, & Trumper, 2004, 2006; Caspi & Gorsky, 2006; Gorsky, Caspi, & Smidt, 2007).
The Open University of Israel is a distance education university that offers undergraduate and graduate studies to students throughout Israel. The learning environment is blended: The University offers a learning method based on printed textbooks, face-to-face tutorials, and an online learning content management system (LCMS) wherein each course has its own website. Course sites simplify organizational procedures and enrich students’ learning opportunities and experiences. Website use is optional so that equality among students is preserved; the website does not replace textbooks or face-to-face tutorials, which are the pedagogical foundations of the Open University. The website provides forums for asynchronous instructor-student and student-student interactions. Each course has a coordinator, who is responsible for all administrative and academic activities, and instructors, who lead tutorials. Instructors and coordinators are available for telephone consultations at specified days and times. Course coordinators define the number of forums made available and their purpose.
We analyzed findings from 50 forums, half from the exact sciences and half from the humanities. We created two composite forums that represented each of the disciplines. To create similar composites, the 50 individual forums were closely matched by group size and course level. Participation in all forums was non-obligatory; no grades or bonuses were linked to student participation. Distributions are shown in Tables 2 and 3.
Two instruments were employed for obtaining data: (1) the course log site that recorded the messages, and (2) the quantitative content analysis technique, which was used to code and analyze transcriptions from the 50 forums. This technique has been widely used; it is reliable and valid (Garrison, 2007). Its implementation, however, requires that several methodological issues be resolved (Garrison, Anderson, & Archer, 2010).
One issue is the level of coding (e.g., indicator vs. category). Content analysis, as described by Rourke and Anderson (2004), is time-consuming, and coding at the indicator level is difficult, often yielding poor reliability (Murphy & Ciszewska-Carr, 2005). In this study, we coded at the category level (Garrison et al., 2006).
A second issue is the unit of analysis. Rourke et al. (1999) identified five units of analysis used in computer conferencing research: proposition units, sentence units, paragraph units, thematic units, and message units (Garrison, Anderson, & Archer, 2010). While there has been some discussion around this issue (Garrison et al., 2006; Fahy, 2001; Rourke, Anderson, Garrison, & Archer, 2001), it remains a challenging decision influenced by research question and context. In the present study, we used the message unit, in accord with Anderson et al.’s (2001) study of teaching presence, Garrison et al.’s (2001) study of cognitive presence, Rourke et al.’s (1999) study of social presence, as well as Akyol, Garrison, and Ozden (2009), Gorsky and Blau (2009), and Shea et al.’s (2010) studies of all three presences.
A third issue is scoring: As in Gorsky and Blau (2009), we analyzed each message and scored each of the 10 categories as either present or not present (1 or 0). In other words, if a category occurred more than once in a given message (say, two distinct occurrences of “open communication”), we recorded present only once. We did not count multiple recurrences of a category within the same message.
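As a minimal sketch of this scoring rule (the ten category names follow the CoI framework described above, but the data structure and function are hypothetical and do not represent the coding software actually used), each message can be represented as a vector of ten binary scores:

```python
# Hypothetical sketch of the present/absent scoring rule described in the text;
# category names follow the CoI framework, but the code itself is illustrative.

COI_CATEGORIES = [
    # teaching presence
    "design_and_organization", "facilitating_discourse", "direct_instruction",
    # cognitive presence
    "triggering_event", "exploration", "integration", "resolution",
    # social presence
    "affective_communication", "open_communication", "group_cohesion",
]

def score_message(categories_observed):
    """Return a 0/1 score for each category: 1 if it occurs at least once in the
    message, 0 otherwise; repeated occurrences within a message are not counted."""
    return {cat: int(cat in categories_observed) for cat in COI_CATEGORIES}

# A message with two separate instances of open communication and one triggering
# question still scores 1 for "open_communication" and 1 for "triggering_event".
print(score_message({"open_communication", "triggering_event"}))
```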
Other issues are objectivity, reliability, and replicability (Rourke et al., 2001). No established standards exist for inter-rater reliability (De Wever, Schellens, Valcke, & Van Keer, 2006), and there is no consensus on a cut-off for the percent agreement statistic: a figure of 0.75–0.80 is often used to determine reliability, while others use 0.70 (Neuendorf, 2002; Rourke et al., 2001). To increase reliability and to control errors brought on by inexperience or misinterpretation, Garrison et al. (2006) suggest a negotiated coding approach: researchers code the transcripts and then actively discuss their respective codes with their fellow judges in order to achieve consensus or near consensus. Gros and Silva (2006) propose a research methodology based on the intervention of the participants, especially course instructors, for analyzing computer-supported communication. In this study we used the traditional coding approach (without negotiation of disagreements or participant intervention): 25% of postings were randomly chosen and independently coded by a second rater; 92% agreement was achieved (Cohen’s κ = 0.89).
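A minimal sketch of how percent agreement and Cohen’s kappa can be computed for two raters is shown below; the function names and the example labels are illustrative (not data from this study) and assume each rater assigns a single category code to each sampled posting.

```python
# Illustrative computation of percent agreement and Cohen's kappa for two raters;
# the labels below are invented examples, not data from the study.
from collections import Counter

def percent_agreement(rater1, rater2):
    return sum(a == b for a, b in zip(rater1, rater2)) / len(rater1)

def cohens_kappa(rater1, rater2):
    n = len(rater1)
    p_observed = percent_agreement(rater1, rater2)
    counts1, counts2 = Counter(rater1), Counter(rater2)
    # Chance agreement: sum over labels of the product of each rater's marginal proportions.
    p_expected = sum(counts1[k] * counts2[k] for k in set(counts1) | set(counts2)) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

r1 = ["open", "cohesion", "trigger", "open", "exploration"]
r2 = ["open", "cohesion", "trigger", "affective", "exploration"]
print(percent_agreement(r1, r2), cohens_kappa(r1, r2))
```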
From each course forum, a three-week segment was analyzed. The period analyzed began one month after the start of the semester in order to ensure that opening messages and initial enthusiasm had waned and that the final exam was still far off.
Table 4 shows the percentage of students who participated in the composite science forum and the composite humanities forum by posting a message.
Student participation in the composite science forum was twice as high as student participation in the composite humanities forum. We also recorded the number of messages posted by instructors and students. Findings are shown in Table 5.
Students in the composite science forum wrote about three times as many messages as did their counterparts in the humanities. This finding, however, is surely related to the fact that twice as many students participated in the composite science forum as opposed to the composite humanities forum (Table 3). Instructors in the composite science forums wrote twice as many messages as did their counterparts in the humanities. We may assume that increased student participation accounts, at least in part, for the greater number of messages posted in the science forum, both by instructors and by the students themselves. In other words, if we factor out the twofold advantage of student participation in the science forum, we see that instructors in both disciplines posted a similar number of messages per student. In the same manner, if we factor out the twofold advantage in the number of science students, the adjusted ratio is 1.5 : 1. Science students wrote about 50% more messages than their counterparts in the humanities. A significant difference was found between the distributions of student and instructor postings in the two disciplines [X2(1) = 306.1, p < .001].
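The adjustment itself is simple arithmetic; the sketch below uses hypothetical counts, chosen only to reproduce the roughly 2 : 1 participation ratio and 3 : 1 raw message ratio described above.

```python
# Hypothetical counts illustrating the adjustment described in the text.
science_participants, humanities_participants = 200, 100    # ~2:1 active students
science_student_msgs, humanities_student_msgs = 600, 200    # ~3:1 raw student messages

raw_ratio = science_student_msgs / humanities_student_msgs            # 3.0
participation_ratio = science_participants / humanities_participants  # 2.0
adjusted_ratio = raw_ratio / participation_ratio                      # 1.5, i.e. 1.5 : 1

print(f"raw {raw_ratio:.2f} : 1, adjusted {adjusted_ratio:.2f} : 1")
```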
We next analyzed the forums in terms of the three dimensions that comprise the community of inquiry model. Findings are shown in Table 6. Table 6 (and tables henceforth) also show the adjusted ratios that take into account the 2.08:1 numerical advantage held by students in the science forums.
Categories of all three presences were found in greater absolute amounts in the composite science forum. Furthermore, a highly significant difference was found between the distributions of teaching presence, cognitive presence, and social presence in the two composite forums, after adjusting for the number of participants [X2(2) = 38.09; p < .0001]: in relative terms, social presence was more prevalent in the composite humanities forum, while cognitive presence was more prevalent in the composite science forum. We next analyzed the data at the level of category. Findings are shown in Table 7.
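A test of this kind can be reproduced with a standard chi-square test of homogeneity on a 2 × 3 table of presence counts; the counts in the sketch below are hypothetical placeholders, not the study’s data.

```python
# Chi-square test comparing distributions of teaching, cognitive, and social presence
# between two forums; counts are hypothetical placeholders.
from scipy.stats import chi2_contingency

#                    teaching  cognitive  social
science_forum    = [    150,      180,      520]
humanities_forum = [     60,       50,      250]

chi2, p, dof, expected = chi2_contingency([science_forum, humanities_forum])
print(chi2, p, dof)  # dof = (2 - 1) * (3 - 1) = 2, matching the X2(2) reported above
```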
For each of the dimensions, and for all categories except cohesion, total amounts were greater in the composite science forum. No significant differences were found within the distributions of teaching presence and cognitive presence. For social presence, significant differences were found for the categories “open communication” (higher in the science forum) and “cohesion” (higher in the humanities forum).
We next analyzed amounts of teaching presence, cognitive presence, and social presence for instructors only. Table 8 presents these data. These ratios are also adjusted because instructor postings are related to the number of students who actively participated.
Adjusted ratios show that instructors in the composite humanities forum were equally or slightly more active than their counterparts in the sciences in nearly all categories; the exception was "exploration," in which science instructors were more active. A significant difference was found within social presence for the category "cohesion" (higher in the humanities forum). In addition, significant differences were found within cognitive presence for the categories "trigger" (higher in the humanities forum) and "exploration" (higher in the science forum).
We next analyzed amounts of teaching presence, cognitive presence, and social presence for students only. Table 9 presents these data. Ratios are adjusted to account for the twofold participation of science students.
Students in the composite science forum were far more active than their counterparts in the humanities. This is especially conspicuous for all categories in teaching presence and for three of the four categories in cognitive presence (excluding “resolution”). Regarding the distributions of the categories, a significant difference was found for social presence.
Given data from 50 forums, we carried out further calculations in order to estimate a population parameter that may characterize sample populations other than the one studied in this investigation. To begin, we calculated the average distribution of cognitive presence, teaching presence, and social presence across both disciplines. Table 10 shows the average distributions of the three presences for the entire sample. Even though there exists a highly significant difference between the distributions of the two forums, we tested for significant differences between each of the forums and the average distribution. No significant differences were found. In other words, assuming the possible existence of a distribution that characterizes the population (19.62 : 18.63 : 61.75), neither forum differed from it significantly. Indeed, such findings may indicate the presence of a bimodal distribution.
If such a distribution of cognitive presence, teaching presence, and social presence is representative of the particular population investigated in this study, it must also manifest itself in a variety of situations, not just in disciplinary differences. To further test the robustness of the proposed population parameter, we calculated the distributions of the three presences across group size (see Table 3). Findings are shown in Table 11. The chi square column tests for significant differences between each of the distributions and the proposed parameter. No significant differences were found.
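The comparisons against the proposed parameter are chi-square goodness-of-fit tests; the sketch below shows one such test with hypothetical observed counts for a single forum, using the 19.62 : 18.63 : 61.75 distribution (cognitive : teaching : social) as the expected proportions.

```python
# Goodness-of-fit test of one forum's presence counts against the proposed
# population parameter; observed counts are hypothetical.
from scipy.stats import chisquare

expected_proportions = [0.1962, 0.1863, 0.6175]  # cognitive : teaching : social
observed_counts = [41, 35, 124]                  # hypothetical counts for one forum

total = sum(observed_counts)
expected_counts = [p * total for p in expected_proportions]
stat, p_value = chisquare(f_obs=observed_counts, f_exp=expected_counts)
print(stat, p_value)  # a non-significant p is consistent with the proposed parameter
```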
To test the robustness of the estimated parameter further, we calculated the distributions of the three presences across course types: introductory, regular, and advanced. Findings are shown in Table 12. No significant differences in the proportions of the three presences were found across course types.
Findings clearly showed the impact of academic discipline on the dialogic behavior of participants in Open University course forums. We hypothesized that for forums in the exact sciences, active participation and levels of social presence, teaching presence, and cognitive presence would be significantly higher than for forums in the humanities. Findings clearly support both hypotheses. We frame the discussion in terms of three basic questions that emerge from the hypotheses, concerning differences in active participation, in instructors' dialogic behavior, and in students' dialogic behavior.
Increased student participation in the composite science forum may be associated with the nature of the discipline. Science courses have a relatively large number of tutor assignments based on problem solving; mandatory problems must be solved, and the forum is a useful resource for interpersonal student-instructor and student-student dialogue. Humanities courses, at least those offered at the Open University of Israel, tend to have fewer tutor assignments (2-4), and these assignments are not based on solving problems.
Asynchronous forums are a resource that supports interpersonal dialogue (Gorsky & Caspi, 2005; Gorsky, Caspi, & Chajut, 2008). Regarding the utilization of this and similar resources for interpersonal dialogue (telephone, e-mail, etc.), it has been shown that students use such resources either when they experience difficulty in understanding subject matter or when they are unable to solve problems (Caspi & Gorsky, 2006; Gorsky, Caspi, & Smidt, 2007; Gorsky, Caspi, & Trumper, 2004; Gorsky, Caspi, & Trumper, 2006; Gorsky, Caspi, & Tuvi-Arad, 2004). Given subject-matter difficulty and tutor assignments based overwhelmingly on problem solving, it seems reasonable that students in the sciences utilized interpersonal dialogue to a much greater extent than did their counterparts in the humanities.
We found significant differences between the categories of cognitive presence and social presence for instructors in the two disciplines. Within the dimension social presence, the category “open communication” was found to a much higher degree among instructors in the science forum. The centrality of problem solving may have been the catalyst that provoked such behavior since social presence includes such indicators as asking questions, referring to or quoting from others’ messages, expressing agreement, and even simply continuing a thread rather than starting a new one. Also within the dimension social presence, the category "cohesion" was found to a much higher degree among instructors in the humanities forum. Given lackluster participation in the humanities forum, instructors may have tried to create a sense of group cohesion and to establish a more positive climate by addressing participants by name, using greetings and closures, and addressing the group as “we,” “our,” and “us,” in order to encourage and to promote participation.
A highly significant disciplinary difference was noted for instructors’ cognitive presence. On the one hand, adjusted ratios (Table 8) show that humanities instructors posted three times as many messages associated with the category “trigger” as their counterparts in the sciences. This would indicate an attempt by humanities instructors to trigger and to encourage discussion in their forums. Indeed, a cursory review of the humanities forum showed that many questions posted by instructors remained unanswered. On the other hand, science instructors posted four times as many messages associated with the category “exploration” as their counterparts in the humanities. This may indicate science instructors’ increased participation in the problem-solving process, alone or together with their students.
We found significant differences for all categories of cognitive presence, teaching presence, and social presence for students in the two disciplines. The most profound example is teaching presence, which may give a positive answer to the question recently posed by Garrison, Cleveland-Innes, and Fung (2010): Does teaching presence through design, facilitation, and direct instruction account for apparent disciplinary differences? Teaching presence among students, however it manifested itself, was very high in the composite science forum but, for all practical intents and purposes, non-existent in the composite humanities forum. According to Vygotsky (1978), attempts to solve problems through social interaction and assistance from more competent peers promote students’ learning abilities in their zones of proximal development (ZPD). Resulting from enhanced peer teaching presence, three of the four categories of cognitive presence were also more abundant in the composite science forum (a grand total of only three instances of “resolution” was recorded).
Findings point to the intriguing possibility that the estimated population parameter for the distribution of cognitive presence, teaching presence, and social presence found in this study (19.62 : 18.63 : 61.75) also holds for asynchronous communities of inquiry in wider contexts. To date, these findings have been obtained only in undergraduate, asynchronous course forums at the Open University of Israel, analyzed by one of several possible analytic procedures based on the community of inquiry model. This relationship may (or may not) exist in other sample populations and settings; only further research will supply an answer. To generalize, the findings must be replicated in other communities of inquiry, including those characterized by course idiosyncrasies such as obligatory participation, in different universities, and in other cultures.
In order to begin the search for replicability, we turned to a previous study that utilized an identical analytic procedure (Gorsky & Blau, 2009), which reported the proportions of each presence in a graduate-level education course (discipline: soft-applied, as opposed to soft-pure) over an entire semester. Table 13 displays these findings alongside those from this study.
There is no significant difference between the distribution from the graduate-level education course and the proposed population parameter [X2(2) = 1.067, p = .59]. Furthermore, as expected, this distribution is nearly identical to that of the typical humanities (discipline: soft-pure) forum [X2(2) = 0.31, p = .85]. Finally, assuming a bimodal distribution vis-à-vis disciplinary difference, it differs significantly from the composite science forum [X2(2) = 12.82, p < .001].
Assuming the existence of a population parameter (19.62 : 18.63 : 61.75), we now investigate its relationship with the distributions obtained for each of the individual course forums; that is, to what extent did they correspond with the estimated parameter? Specifically, we calculated the standard deviation of the mean value for the magnitude of social presence in the composite science and humanities forums. Confidence intervals are shown in Table 14.
Fourteen of the 25 forums in the humanities discipline lie within the 95% confidence interval; six of the 25 forums in the exact and natural science disciplines lie within it.
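The text does not specify the exact interval construction, so the sketch below, which uses hypothetical per-forum proportions of social presence, shows two plausible readings: a 95% interval around the mean (mean ± 1.96·SD/√n) and a 95% range for individual forums (mean ± 1.96·SD). Either can then be used to count how many individual forums fall inside it.

```python
# Hypothetical per-forum proportions of social presence (values invented for illustration).
import math

social_presence_props = [0.58, 0.64, 0.61, 0.70, 0.55, 0.66, 0.59, 0.63]

n = len(social_presence_props)
mean = sum(social_presence_props) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in social_presence_props) / (n - 1))

# Two plausible constructions of a "95% interval":
ci_of_mean = (mean - 1.96 * sd / math.sqrt(n), mean + 1.96 * sd / math.sqrt(n))
range_for_individual_forums = (mean - 1.96 * sd, mean + 1.96 * sd)

# Count how many individual forums fall inside the wider interval.
lo, hi = range_for_individual_forums
inside = sum(lo <= x <= hi for x in social_presence_props)
print(mean, sd, ci_of_mean, range_for_individual_forums, inside)
```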
We have to remember that what we observe is not nature herself, but nature exposed to our method of questioning. (Heisenberg, 1958)
We reiterate that all findings from this study were obtained by using a particular scoring procedure (see Instruments and Procedure). Given the use of this procedure, we found highly significant relationships between academic discipline and dialogic behavior in Open University course forums. We also estimated a population parameter for the distribution of the three presences in asynchronous communities of inquiry. On the one hand, given the established reliability and validity of this particular procedure, these findings are more than mere artifacts. On the other hand, given the diversity of approaches to content analysis, these findings need further corroboration using different approaches and procedures.
Akyol, Z., Arbaugh, J. B., Cleveland-Innes, M., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. (2009). A response to the review of the community of inquiry framework. Journal of Distance Education, 23, 123-136.
Akyol, Z., Garrison, D. R., & Ozden, M. Y. (2009). Online and blended communities of inquiry: Exploring the developmental and perceptional differences. The International Review of Research in Open and Distance Learning, 10, 65-83.
Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching presence in a computer conferencing context. Journal of Asynchronous Learning Networks, 5. Retrieved from http://sloan-c.org/publications/jaln/v5n2/pdf/v5n2_anderson.pdf
Arbaugh, J. B. (2008). Does the community of inquiry framework predict outcomes in online MBA courses? The International Review of Research in Open and Distance Learning, 9, 1-12. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/490/1048
Arbaugh, J. B., Bangert, A., & Cleveland-Innes, M. (2010). Subject matter effects and the community of inquiry (CoI) framework: An exploratory study. The Internet and Higher Education, 13, 37-44.
Arbaugh, J. B., & Hwang, A. (2006). Does "teaching presence" exist in online MBA courses? The Internet and Higher Education 9, 9-21.
Becher, T. (1990). Academic tribes and territories. Milton Keynes: Open University Press.
Becher, T. (1994). The significance of disciplinary differences. Studies in Higher Education, 19, 151-161.
Becher, T., & Trowler, P. R. (2001). Academic tribes and territories: Intellectual enquiry and the cultures of disciplines (2nd ed.). Buckingham, UK: Open University Press.
Biglan, A. (1973). The characteristics of subject matter in different academic areas. Journal of Applied Psychology, 57, 195-203.
Bliss, C. A., & Lawrence, B. (2009). From posts to patterns: A metric to characterize discussion board activity in online courses. Journal of Asynchronous Learning Networks, 13, 1-18. Retrieved from http://www.cems.uvm.edu/~cbliss/Discussion_Paper_Bliss-Lawrence-2008.pdf
Caspi, A., & Blau, I. (2008). Online discussion groups: The relationship between social presence and perceived learning. Social Psychology of Education, 11, 323-346.
Caspi, A., & Gorsky, P. (2006). Open university students’ use of dialogue. Studies in Higher Education, 31, 735-752.
Caspi, A., Gorsky P., & Chajut, E. (2003). The influence of group size on non-mandatory asynchronous instructional discussion groups. The Internet and Higher Education, 6, 227-240.
De Wever, B., Schellens, T., Valcke, M., & Van Keer, H. (2006). Content analysis schemes to analyze transcripts of online asynchronous discussion groups: A review. Computers & Education, 46, 6-28.
Dewey, J. (1933). How we think (Rev. ed.). Boston: D.C. Heath.
Fahy, P. J. (2001). Addressing some common problems in transcript analysis. International Review of Research in Open and Distance Learning, 1. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/321/530
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education 2, 87-105.
Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking and computer conferencing: A model and tool to assess cognitive presence. American Journal of Distance Education, 15, 7-23.
Garrison, D. R., Anderson, T., & Archer, W. (2010). The first decade of the community of inquiry framework: A retrospective. The Internet and Higher Education, 13, 5-9.
Garrison, D.R., & Arbaugh, J.B. (2007). Researching the community of inquiry framework: Review, issues, and future directions. Internet and Higher Education, 10, 157–172.
Garrison, D. R., & Archer, W. (2003). A community of inquiry framework for online learning. In M. Moore (Ed.), Handbook of distance education. New York: Erlbaum.
Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in online learning: Interaction is not enough. American Journal of Distance Education 19, 133-148.
Garrison, D. R., Cleveland-Innes, M., Koole, M., & Kappelman, J. (2006). Revisiting methodological issues in the analysis of transcripts: Negotiated coding and reliability. The Internet and Higher Education, 9, 1-8.
Garrison, D. R., Cleveland-Innes, M., & Fung, T. S. (2010). Exploring causal relationships among teaching, cognitive and social presence: Student perceptions of the community of inquiry framework. The Internet and Higher Education, 13, 31-36.
Gorsky, P., & Blau, I. (2009). Online teaching effectiveness: A tale of two instructors. The International Review of Research in Open and Distance Learning, 10, 1-27.
Gorsky, P., Caspi, A., & Tuvi-Arad I. (2004). Use of instructional dialogue by university students in a distance education chemistry course. Journal of Distance Education, 19, 1-19.
Gorsky, P., Caspi, A., & Trumper, R. (2004). Dialogue in a distance education physics course. Open Learning: The Journal of Open and Distance Learning, 19, 265-277.
Gorsky, P., Caspi, A., & Trumper, R. (2006). Campus-based university students’ use of dialogue. Studies in Higher Education, 31, 71-87.
Gorsky, P., Caspi, A., & Smidt, S. (2007). Use of instructional dialogue by university students in a difficult distance education physics course. Journal of Distance Education, 22, 1-22.
Gros, B., & Silva, J. (2006). El problema del análisis de las discusiones asincrónicas en el aprendizaje colaborativo mediado [The problem of analyzing asynchronous discussions in mediated collaborative learning]. Revista de Educación a Distancia (RED), 16. Retrieved from http://www.um.es/ead/red/16/gros.pdf
Heisenberg, W. (1958). Physics and philosophy: The revolution in modern science. Lectures delivered at University of St. Andrews, Scotland, Winter, 1955-56. NY: Harper and Row.
LaPointe, D. K., & Gunawardena, C. N. (2004). Developing, testing and refining of a model to understand the relationship between peer interaction and learning outcomes in computer-mediated conferencing. Distance Education, 25, 83–106.
Liu, X., Bonk, C. J., Magjuka, R. C., Lee, S., & Su, B. (2005). Exploring four dimensions of online instructor roles: A program level case study. Journal of Asynchronous Learning Networks, 9, 29-48.
Meyer, K. A. (2003). Face-to-face versus threaded discussions: The role of time and higher-order thinking. Journal of Asynchronous Learning Networks, 7, 55-65.
Murphy, E. (2004). Recognizing and promoting collaboration in an online asynchronous discussion. British Journal of Educational Technology, 35, 421-431.
Murphy, E., & Ciszewska-Carr, J. (2005). Sources of difference in reliability: Identifying sources of difference in reliability in content analysis of online asynchronous discussions. International Review of Research in Open and Distance Learning, 6. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/233/855
Neuendorf, K. A. (2002). The content analysis guidebook. Thousand Oaks, CA: Sage Publications.
Neumann, R. (2001). Disciplinary differences and university teaching. Studies in Higher Education, 26, 135-146.
Palloff, R., & Pratt, K. (2003). The virtual student: A profile and guide to working with online learners. San Francisco, CA: Jossey-Bass Inc.
Pawan, F. T., Paulus, M., Yalcin, S., & Chang, C. (2003). Online learning: Patterns of engagement and interaction among in-service teachers. Language Learning & Technology, 7, 119-140.
Picciano, A.G. (2002). Beyond student perceptions: Issues of interaction, presence, and performance in an online course. Journal of Asynchronous Learning Networks, 6. Retrieved from http://www.aln.org/publications/jaln/v6n1/pdf/v6n1_picciano.pdf
Richardson, J.C., & Swan, K. (2003). Examining social presence in online courses in relation to students' perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7. Retrieved from http://www.aln.org/publications/jaln/v7n1/pdf/v7n1_richardson.pdf
Rourke, L., & Anderson, T. (2004). Validity issues in quantitative computer conference transcript analysis. Educational Technology Research and Development, 52, 5-18.
Rourke, L., Anderson, T., Garrison, D. R., & Archer, W. (1999). Assessing social presence in asynchronous text-based computer conferencing. Journal of Distance Education, 14, 50-71. Retrieved from http://www.jofde.ca/index.php/jde/article/viewArticle/153/341
Rourke, L., Anderson, T., Garrison, D. R., & Archer, W. (2001). Methodological issues in the content analysis of computer conference transcripts. International Journal of Artificial Intelligence in Education 12, 8-22.
Rourke, L., & Kanuka, H. (2009). Learning in communities of inquiry: A review of the literature. Journal of Distance Education, 23, 19-48.
Shea, P., & Bidjerano, T. (2009a). Cognitive presence and online learner engagement: A cluster analysis of the community of inquiry framework. Journal of Computing in Higher Education, 21, 199-217.
Shea, P., & Bidjerano, T. (2009b). Community of inquiry as a theoretical framework to foster "epistemic engagement" and "cognitive presence" in online education. Computers & Education, 52, 543-553.
Shea, P., Hayes, S., Vickers, J., Gozza-Cohen, M., Uzuner, S., Mehta, R., Valchova, A., & Rangan, P. (2010). A re-examination of the community of inquiry framework: Social network and content analysis. The Internet and Higher Education, 13, 10-21.
Shea, P., Li, C. S., & Pickett, A. M. (2006). A study of teaching presence and student sense of learning community in fully online and web-enhanced college courses. The Internet and Higher Education, 9, 175−190.
Shea, P., Li, C. S., Swan, K., & Pickett, A. M. (2005). Developing learning community in online asynchronous college courses: The role of teaching presence. Journal of Asynchronous Learning Networks, 9. Retrieved from http://www.sloan-c.org/publications/jaln/v9n4/pdf/v9n4_shea.pdf
Shea, P. J., Pickett, A. M., & Pelz, W. E. (2004). Enhancing student satisfaction through faculty development: The importance of teaching presence. In J. Bourne & J. C. Moore (Eds.), Elements of quality online education: Into the mainstream, Volume 5 (pp.39-59). Needham, MA: Sloan-C.
Steinweg, S. B., Trujillo, L., Jeffs, T., & Hopfengardner-Warren, S. (2006). Maintaining the personal touch in a growing program: Strategies for establishing social presence in online classes. Journal of the Research Center for Educational Technology, 2. Retrieved from http://www.rcetj.org/?type=art&id=79598&
Swan, K. (2002). Building communities in online courses: The importance of interaction. Education, Communication and Information, 2, 23-49.
Swan, K., Garrison, D.R., & Richardson, J. (2009). A constructivist approach to online learning: The community of inquiry framework. In C.R. Payne (Ed.), Information technology and constructivism in higher education: Progressive learning frameworks. Hershey, PA: IGI Global.
Swan, K., & Shea, P. (2005). The development of virtual learning communities. In S.R. Hiltz & R. Goldman (Eds.), Asynchronous learning networks: The research frontier (pp. 239-260). New York: Hampton Press.
Swan, K., Shea, P., Richardson, J., Ice, P., Garrison, D. R., Cleveland-Innes, M., & Arbaugh, J. B. (2008). Validating a measurement tool of presence in online communities of inquiry. E-Mentor, 2, 1-12.
Swan, K., & Shih, L. F. (2005). On the nature and development of social presence in online course discussions. Journal of Asynchronous Learning Networks, 9. Retrieved from http://www.sloan-c.org/publications/JALN/v9n3/v9n3_swan.asp
Varnhagen, S., Wilson, D., Krupa, E., Kasprzak, S., & Hunting, V. (2005). Comparison of student experiences with different online graduate courses in health promotion. Canadian Journal of Learning and Technology, 31, 99-117.
Vaughan, N. (2004). Investigating how a blended learning approach can support an inquiry process within a faculty learning community (Doctoral dissertation). University of Calgary. Retrieved from http://www.ucalgary.ca/~nvaughan/norm/nvaughandissertation.pdf
Vaughan, N., & Garrison, D. R. (2006). How blended learning can support a faculty development community of inquiry. Journal of Asynchronous Learning Networks 10. Retrieved from http://www.sloan-c.org/publications/JALN/v10n4/v10n4_vaughan.asp
Vesely, P., Bloom, L., & Sherlock, J. (2007). Key elements of building online community: Comparing faculty and student perceptions. MERLOT Journal of Online Learning and Teaching, 3. Retrieved from http://jolt.merlot.org/vol3no3/vesely.pdf
Vygotsky, L. (1978). Mind in society: The development of higher psychological processes. Cambridge MA: Harvard University Press.
Wu, D., & Hiltz, S. R. (2004). Predicting learning from asynchronous online discussions. Journal of Asynchronous Learning Networks, 8, 139-152.