Julie Shattuck
Frederick Community College, USA
Bobbi Dubins
Allegany College of Maryland, USA
Diana Zilberman
Baltimore City Community College, USA
This article reports on an inter-institutional project to design, develop, pilot, and evaluate a state-wide online training course for higher education adjunct faculty who are preparing to teach their first online course. We begin with a brief literature review to contextualize the stated problem the project sought to address: the need for quality, accessible training for online adjunct faculty. We then give background information to describe the environment in which the project was situated before detailing the process of designing and piloting the first iteration of the Certificate for Online Adjunct Teaching (COAT) course. Using a mixed-methods approach (surveys and reflection journals), data were collected from the adjunct faculty who took the COAT course, the COAT instructor, and the COAT design team. The results indicate that the pilot COAT course did meet the perceived needs and expectations of the course participants. We finish by discussing our plans for the next phase of this project.
Keywords: Adjunct faculty; online teaching; professional development; online learning
Research has highlighted that different roles and competencies are needed for online teaching than for traditional, on-campus instruction (Berge, 1995; Goodyear, Salmon, Spector, Steeples, & Tickner, 2001; Ragan, 2009; Smith, 2005; Varvel, 2007). Using Berge’s (1995) four instructor roles for moderating online discussions, Morris and Finnegan (2008–2009) found that novice online instructors “enacted a management role to a limited degree, and rarely posted a comment classified as ‘pedagogical’” (p. 61); however, experienced online instructors "enacted multiple roles – social, managerial, and pedagogical – to engage students and increase student persistence and success” (p. 61). To assist novice online instructors in becoming competent in all four of Berge’s online roles, higher education institutions may offer some form of training in online teaching. However, this training may not be available to all instructors, particularly part-time, adjunct faculty who have limited access to on-campus training opportunities, and the training may not be sufficient to adequately prepare instructors to effectively teach online.
This article focuses on an ongoing project in Maryland, United States, which began in 2008 when MarylandOnline (MOL), a statewide consortium of higher education institutions, funded an exploratory research project to see if there was interest in a shared training program to prepare adjunct faculty to teach online. The research indicated that there was a need for such a program, and this article focuses on describing the second phase of the project: the development, delivery, and evaluation of a pilot Certificate for Online Adjunct Teaching (COAT) course. The article begins with a brief literature review to contextualize the stated problem the project seeks to address: the need for quality, accessible training for online adjunct faculty. We then give background information to describe the environment in which the project is situated before detailing the process of designing and piloting the first iteration of the COAT course. Evaluation data from the pilot course are presented and analyzed before we discuss our recommendations for future iterations of the COAT course.
A recent report focused on online learning in the United States found that “online enrollments have continued to grow at rates far in excess of the total higher education student population” (Allen & Seaman, 2010, p. 1), and comparative enrollment trends for community colleges from fall 2007 to fall 2008 “reported a 22% increase for distance education enrollments” (Instructional Technology Council, 2010, p. 2). Tipple (2010) highlighted that this increase in online enrollment is inter-related with a second trend: “the significant increase in adjunct (part-time) faculty” (para. 1). The Center for Community College Student Engagement (2009) found that 67% of all community college instructors taught part-time (p. 18), and Seaman (2009), surveying instructors employed at four-year institutions in the United States, discovered that “part-time faculty are more likely to engage in online learning than their full-time counterparts, with 32.4% of part-time faculty currently teaching online compared to 22.2% of full-timers” (p. 15).
Kanuka, Jugdev, Heller, and West’s (2008) exploratory study focusing on academics who worked from home (of which 66.5% were adjunct faculty) concluded that new instructors should be provided with “an option for sustained early training in distance-delivered online teaching” (p. 162), and that such training should be delivered online. However, recent studies suggest that professional development opportunities focusing on helping instructors become familiar with online teaching roles and competencies may not be available for all instructors. For example, Allen and Seaman (2010) found that “19% of institutions with online offerings report that they have no training or mentoring programs for their online teaching faculty” (p. 3). Pagliari, Batts, and McFadden’s (2009) research into desired versus actual training for online instructors showed that over 40% of surveyed online instructors had not accessed any training in the past year.
For institutions that do provide training for instructors transitioning to online teaching, the training may not be offered in a format that is easily accessible for adjunct faculty. Allen and Seaman (2010) found that “the most common training approaches…are internally run training courses (65%) and informal mentoring (59%)” (p. 3), but details were not provided on the structure or format of the trainings. It is likely that adjunct and full-time faculty training needs vary, with online adjuncts less able to attend on-campus workshops or participate in mentoring if it occurs through informal face-to-face meetings.
Two recent doctoral dissertations have focused on the training needs of online adjunct faculty. Biro (2005) conducted qualitative research that explored, among other topics, online adjuncts’ perceptions of their preparation to teach online. Biro concluded that “instructional teams comprised of faculty, administrators, technologists, and instructional design specialists work best when helping faculty prepare to teach online” (2005, p. 90), and that this team-based training “must encourage and facilitate critical-thinking opportunities for faculty who teach online so they can reflect on their decisions as educators and on their learning as students” (2005, p. 93). Blodgett (2008) performed an exploratory, descriptive study of adjuncts’ professional development experiences and preferences to prepare them to teach online. Blodgett’s study addressed “the lack of information regarding professional development of part-time/adjunct faculty in preparation for online teaching from the perspective of such faculty” (2008, p. 7). Her research found that adjuncts’ perceived needs and preferences for training included (a) the use of online formats to provide flexible access, (b) the provision of the experience of being online students, and (c) the offer of mentoring for continued support. Blodgett gave three recommendations based on her research findings, the first being that “universities should develop formalized, yet flexible faculty development programs for adjunct faculty who are hired to teach online courses” (2008, p. 88).
To summarize, with the increase in online enrollments and the number of adjuncts teaching online courses comes a need for quality training that is accessible to adjunct faculty. Recent research recommends that this training should be designed by teams of faculty, administrators, instructional designers, and technologists, and that the training should be offered in an online format that gives instructors the experience of being online students. The next section discusses how the problem contextualized in this brief literature review—namely, the need for quality, accessible training for new online adjunct faculty—is being addressed within a specific context: higher education institutions in Maryland.
This section begins by introducing two groups involved in online learning in Maryland: MarylandOnline (MOL) and the Instructional Design Affinity Group (IDAG). IDAG received MOL grants in 2008 and 2009 to initiate the COAT project. The COAT initiative has been a collaborative project involving a number of individuals from both MOL and IDAG. A list of major project contributors can be found on the COAT Web site (COAT Project, 2010a).
MarylandOnline is a consortium of independently governed higher education institutions in Maryland. MOL’s mission states that it is
a statewide, inter-segmental consortium, dedicated to championing distance learning in Maryland. Through collaboration among Maryland community colleges, colleges, and universities, MarylandOnline facilitates students’ access to articulated courses, certificates, and degree programs offered via distance; and promotes excellence in Web-based learning in the physical as well as in the virtual classroom. With strategic partners, MarylandOnline enhances the quality and availability of higher education for the citizens and employers of Maryland and for students worldwide. (MarylandOnline, 2010)
MarylandOnline was established in 1999 and is considered by its member institutions to be innovative and progressive in its approach to championing the cause of distance education. This was reinforced in 2003 by the awarding of a U.S. Department of Education Fund for the Improvement of Postsecondary Education (FIPSE) grant. The chief goal of the grant was the development of a “replicable pathway for inter-institutional quality assurance and course improvements in online learning” (Quality Matters, 2010, para. 1). The product of the FIPSE grant, Quality Matters, has since become nationally recognized for its faculty peer review certification process for online courses.
IDAG, an affinity group of the Maryland Distance Learning Association, is composed primarily of instructional designers working in higher education contexts. The stated mission of IDAG is to promote “the use of instructional design for learning activities that are mediated by technology” (Instructional Design Affinity Group, 2010, para. 1). IDAG’s goals include supporting Maryland distance learning programs and fostering partnerships through collaboration. The project described in this paper came about through IDAG collaboration focused on perceived training needs for preparing instructors to teach online.
In 2008, IDAG applied for a grant from MOL in order to conduct research on how interested Maryland institutions might be in the development of a state-wide online teaching certificate for instructors in higher education. As instructional designers at Maryland institutions, many IDAG members were responsible for providing training for instructors at their institutions in the areas of pedagogy and technologies used for online teaching. It became apparent that many instructional designers within the group were developing similar training sessions for instructors at their respective institutions. It was felt that the creation of a training course that could be shared among institutions might reduce this duplication of effort while also expanding the number, quality, and consistency of trainings offered to online instructors within Maryland’s higher education community.
Many institutions were also grappling with how to properly prepare new instructors to teach online. With the success of MOL’s Quality Matters project (Quality Matters, 2010) and its impact on defining and certifying the quality of course design, institutions were turning their attention to the quality of the delivery of those courses.
Instructors themselves seemed interested in obtaining some type of formal designation indicating they had a certain level of online teaching expertise. Adjunct instructors, who often teach for multiple institutions, were sometimes required to complete potentially identical training at each institution. In contrast, some adjunct instructors did not have access to training at all because their institution did not offer it or did not offer it in a format or time frame that was convenient for them. It was envisioned that the creation of a sharable training course would increase the availability of training to instructors. It could also potentially increase the pool of trained adjunct faculty for institutions to draw upon. Hence, it was envisioned that the project could benefit MOL member institutions through (a) providing access to COAT course design and training materials, and (b) providing access to a pool of trained instructors. The project could benefit adjunct instructors through (a) providing access to training that is familiar to MOL institutions, (b) providing a proven method to document their skills, and (c) offering access to training that might not currently be available or easily accessible to them.
MOL responded to IDAG’s grant request by awarding an initial grant to the group in the fall of 2008. The primary purposes of the grant were (a) to research the training needs of Maryland’s higher education institutions, (b) to research the level of interest Maryland’s higher education institutions might have in a shared training course or program, and (c) if there appeared to be sufficient interest, to recommend one or more program models that might allow MOL to offer training sessions or certification courses as a state-wide group. The group first reviewed current literature on online teaching competencies and researched existing higher education training programs for online teaching (Dubins & Graham, 2009).
A survey was then conducted on the training needs of Maryland’s higher education institutions. The Maryland Higher Education Commission (MHEC) Web site (MHEC, n.d.) was used to identify higher education institutions in Maryland. Thirty-seven institutions were identified as having credit online course offerings or programs in place, and invitations to participate in the survey were sent via email to the distance learning administrators and instructional designers/faculty trainers of these 37 institutions. Multiple responses from institutions were permitted in order to collect more comprehensive data (i.e., the researchers saw a need for data collected from both administrator and instructional designer/faculty trainer perspectives). Respondents were required to identify themselves in order to detect duplicate responses from institutions.
The survey gathered information about faculty training/professional development sessions offered by institutions to their online instructors. Information gathered included (a) topics/competencies covered, (b) delivery mode, (c) identification of unmet training needs, and (d) reasons why unmet training needs were not being addressed. Finally, the survey included questions designed to gauge interest in training offered by a central Maryland organization and interest in a state-wide certification program for online instructors.
The survey response rate was 59% with a total of 27 responses received from 22 institutions (five institutions provided responses from two different respondents). The majority of responses were received from distance learning directors/managers (13 responses) and instructional designers/technologists/faculty trainers (13 responses).
Selected results of the Maryland Faculty Training Needs Assessment Survey indicated interest among Maryland’s higher education distance learning professionals in developing a state-wide training program focused on the competencies needed to teach online. The survey results also revealed which training topics institutions were currently offering (see Table 1) and which topics they felt needed to be offered but were not currently available to their faculty. Responses to an open-ended question on what training they would like to offer, but currently did not, fell into the following categories: (a) teaching online (six responses); (b) pedagogy (two responses); (c) assessment (four responses); (d) managing online discussions (one response); (e) Americans with Disabilities Act (two responses); (f) copyright (two responses); (g) course design (two responses); and (h) technology (two responses).
The topics identified in Table 1, as well as the results of the literature review to identify online teaching competencies, were detailed in the report to MOL at the end of phase one. The report included recommendations that (a) the training should be delivered fully online, include formal assessment of core competencies, and focus on teaching online, not on course design; (b) an advisory board composed of experienced online instructors, instructional designers, and distance learning administrators should be formed; and (c) the training should be available to both new and experienced instructors. The report also recommended course competencies that were incorporated into the COAT syllabus in phase two (COAT Project, 2010c).
The phase one project report recommended that MOL fund a second phase of the project focused on the development and pilot offering of a training course aimed at preparing adjunct faculty to teach online. This section focuses on phase two of this project, which was completed in the academic year 2009–2010. The logistics of setting up an inter-institutional training course are discussed first.
Preparation for phase two of the project necessitated first identifying major project tasks and determining a timeline for project activities. These activities reflected the main components needed in an instructional design plan as identified by Morrison, Ross, and Kemp (2007): learner characteristics, task analysis, instructional objectives, content sequencing, instructional strategies, designing the message, development of instruction, and evaluation instruments (p. 12).
August–September 2009:
October–November 2009:
December 2009–March 2010:
April–June 2010:
The first task in designing the COAT course was to decide on the course structure and write the syllabus. Using the recommendations from phase one’s research, it was decided to deliver the course completely online as a nine-week asynchronous course consisting of four modules. The modules encompassed the eight main competency areas: (a) orienting students to online learning; (b) technology skills; (c) learning management skills; (d) basic instructional design principles; (e) pedagogy and andragogy; (f) social process and presence; (g) managing assessment; and (h) legal and institution-specific policies and procedures (COAT Project, 2010b). The course description reflected elements from the community of inquiry framework (Garrison, Anderson, & Archer, 2000) with a particular emphasis on social and teaching presences.
A primary objective for the paced COAT course structure was to provide instructors with the experience of online learning from the student's perspective. The concept of a group training experience led by an instructor, as opposed to self-paced study with no instructor, drew on Bandura’s (1977) social learning theory, in particular on the idea of modeling. By participating in a well-designed online course facilitated by an experienced online instructor who modeled identified best practices, participants would benefit through observing the practical implementation of what they studied in the course.
The course syllabus gave a detailed course description, including teaching methods, learning objectives, and assessment methods (COAT Project, 2010c). Course design standards provided to the design team indicated that the course should include structured weekly content similar to what instructors would likely use in their own online courses, such as (a) using a textbook, articles, and Web sites as required readings; (b) viewing videos; (c) completing written and interactive exercises; (d) completing quizzes, self-checks, and self-reflection assignments; and (e) interacting with other participants in discussion boards and group activities.
The next task was to address any concerns of distance learning administrators at MOL-affiliated institutions. An advisory board was formed in August 2009 and included representatives from a number of MOL-affiliated institutions and organizations. The advisory board initially focused on addressing areas of concern that had been expressed by some institutions. The first area of concern was the use of the word recognize in the grant proposal for phase two: colleges would recognize the training. The project management team clarified that the intent of the project was to offer training with content that was familiar to MOL participating institutions, not to mandate the training or to require institutions to formally recognize it. Individual institutions were free to determine whether the training met or contributed to their training needs, and to what extent.
The second area of concern was the title of the project (i.e., Certificate for Online Adjunct Teaching). Some institutions were uncomfortable with using the word certificate in the title of the project, citing concerns that participants might misinterpret it to be a professional certificate or a credit-course certificate program. The advisory board was not able to come to consensus on this issue prior to the drafting of the phase two proposal to MOL, so it was agreed that the title of the project for phase two would be modified to the Online Adjunct Teaching project and that the group would revisit the title of the project upon completion of phase two.
The third area of concern was the target audience for the course. It was clarified that the course would be targeted toward adjunct instructors who were experienced face-to-face college teachers, but were new to teaching online.
Using the input provided by the advisory board, the project management team presented a detailed proposal for phase two of the project to the MOL board in September 2009; the proposal was subsequently approved. The advisory board was active throughout phase two of the project and offered input and advice on various facets of the project, including recommendations for continuing the project into phase three under the project title Certificate for Online Adjunct Teaching.
In addition to the project advisory board, the project management team solicited input from the distance learning directors affinity group, which was composed of directors of distance learning at MOL member institutions (or staff with similar responsibilities). The course was showcased to the group in June 2010, and the feedback regarding the course was excellent.
The COAT course was developed using a collaborative, inter-institutional team approach. Preparation for course design and development began in the fall of 2009 with the recruitment and selection of the course development team. The course development team included members from six Maryland institutions who were experienced online instructors (full-time and adjunct), instructional designers, and/or distance learning administrators. All members had extensive experience in instructional design and were well-versed in the Quality Matters course design standards.
With the exception of an initial team meeting, the team met and designed the course entirely online using Internet conferencing and collaboration tools. The team met on a weekly basis over a period of four months.
All team members were employed in positions at their respective colleges; thus, it was essential to maximize efficiency and effectiveness of the team. The initial team meeting was held face-to-face in November 2009, at which time the project leaders outlined the major tasks of the design team and the project timeline. They also shared the roles they envisioned for each member of the team and gave each team member an opportunity to accept, decline, or modify their role and/or time commitment. Hence, at the conclusion of the initial meeting, each member had a clear idea of what was expected and was enthusiastic about his or her role on the team.
Also during the initial meeting, the project leaders distributed copies of the course syllabus and module objectives, which they had determined using the research conducted in phase one of the project and which were supported by the advisory board (COAT Project, 2010c). In addition, proposed course development standards were introduced in order to ensure coherence across course content and adherence to good instructional design standards and practices. Design and development of the course occurred from January through April 2010. Course design highlights included that the course
A design team survey, conducted at the conclusion of course development, indicated that the team unanimously felt the inter-institutional team approach to designing the course resulted in a course of much higher quality than one designed by a one- or two-person team. They felt the team collaboration allowed for a more diverse pool of ideas, as well as a more diverse pool of knowledge (i.e., each team member brought a different strength to the project). In addition, the inter-institutional approach to designing and developing the course resulted in more comprehensive coverage of topics and issues that adjunct instructors from different institutions might encounter. Despite a heavy workload and unforeseen external demands on some team members, the team unanimously indicated they found the experience to be rewarding and were proud of the course they had produced. In addition, all team members noted that they felt they had been given adequate license to be creative and innovative.
Participants were recruited for the pilot course through (a) a COAT presentation at the 2009 Maryland Consortium for Adjunct Professional Development conference, (b) referrals from distance learning administrators, and (c) referrals from MOL board members. Of the 65 applicants for the online pilot course, 20 were chosen. Criteria for selection included (a) experience as an adjunct instructor with no previous online teaching experience, (b) availability during the pilot course period of April through June 2010, (c) affiliation with an MOL member institution, and (d) teaching discipline. The 20 participants represented 10 Maryland institutions. Two of the participants withdrew from the course within the first week, citing personal reasons for their withdrawal (lack of sufficient time, lack of technical skills). Of the remaining 18 participants, 17 completed the course successfully. The pilot course was offered at no cost to participants.
The purpose for evaluating the pilot course was to focus on how the participants and instructor perceived the effectiveness of the course content and design for preparing adjuncts to teach their first online course. When participants applied to take the course, they were informed that they would be asked to provide feedback on their experiences in the course, specifically on how the course could be improved for future participants. Participants were asked to give permission to use their course contributions (submitted assignments, discussion board postings, survey responses, etc.) for evaluation purposes. Participants were assured that their contributions would be presented anonymously and their evaluation comments would have no impact on their successful completion of the course. All participants voluntarily signed a permission form.
The evaluation approach was based within a social constructivist epistemology as defined by Koro-Ljungberg, Yendol-Hoppey, Smith, and Hayes (2009). Koro-Ljungberg et al.’s description of a social constructivist epistemology included the following: considering the researcher as having a multifaceted, participatory role; having research goals to “negotiate and transform the practice” (2009, p. 690); and viewing knowledge as being generated from participants. The researcher who conducted and analyzed the evaluations was a member of the COAT leadership team, but was not a member of the course design team. The aim of the research was to use course participants’ feedback to make changes to the pilot course where necessary in order to improve the course for future offerings.
Evaluation data were collected from the participants in the pilot course using a mixed-methods approach: surveys (four module surveys and an end-of-course survey) and course documents (e.g., reflection journals). The surveys contained both Likert scale questions and open-ended questions in order to provide both quantitative and qualitative data.
Participants were asked to complete an online survey within the learning management system at the end of each module and an additional survey at the end of the course (a total of five surveys ranging from 17 to 34 questions in length). Participants were assured that their responses were anonymous and no response could be directly linked with a participant’s name. Out of 17 participants, 16 completed the end-of-course and module 1 surveys, 15 completed the modules 2 and 4 surveys, and 14 completed the module 3 survey. The researcher tabulated the Likert scale questions and categorized the open-ended responses into common topics. Other members of the COAT team were asked to review the categories and make comments on whether the categories reflected the data in a way that would inform useful course redesign decisions. In the interest of space, only partial results of the surveys are given. Tables 2 and 4 show the compiled results for the four module surveys’ closed response questions, Table 3 gives the results to the Likert scale questions in the end-of-course survey, and Table 5 shows the results from one of the open-ended response questions for the first module.
The data presented in Table 2 show that the majority of participants indicated that they either strongly agreed or agreed that the course content was clearly stated (97% of responses), the assignments and activities were clearly explained (88%), and the content was useful (95%). Highlights from the end-of-course survey responses were that the majority of participants strongly agreed or agreed that the course met their needs in preparing them to teach online, modeled good course design and teaching practices, and presented content in a way that met their preferred learning style.
In addition, the majority of the participants strongly disagreed or disagreed that they would have preferred to access the training via four separate modules (10 out of 16 respondents) or as self-paced individual study (12 out of 16 respondents). In response to one of the open-ended questions, “What did you like most about the course?”, seven responses mentioned the experience of being an online student. For example, one participant stated, “I liked most that I got to experience it as a student.” These results confirmed for the COAT team the value of having cohort-based, paced online training that positioned participants as online students.
One interesting development that the design team had not originally planned was the inclusion of two optional synchronous meetings using web-conferencing software. The course was designed as a completely asynchronous course, but the instructor suggested offering optional synchronous meetings. These meetings were well received by the attendees, with five participants strongly agreeing or agreeing that the synchronous meetings were useful.
The pilot course was designed with the expectation that participants would spend approximately four to five hours a week working on the course. Table 4 shows that module 2 had the heaviest workload in terms of how many hours participants felt they worked on course content (9 out of 15 respondents felt they worked seven or more hours a week on this module). In the end-of-course survey, 12 out of 16 respondents strongly agreed or agreed that the amount of content covered each week was reasonable, but 9 out of 16 strongly agreed or agreed that they found it challenging to keep up with the workload. A number of responses to the open-ended question on what participants liked least about the course suggested that the workload expectations could be revisited: “Sometimes having to read all of the discussion threads seemed overwhelming,” and “a few of the weeks were challenging with the amount of perceived work.” The course design team had been concerned about the amount of work in module 2, and so made module 2 a three-week rather than a two-week module (modules 1, 3, and 4 were two-week modules).
The open-ended responses were categorized into topics in an attempt to see if there were any clear patterns in the participants’ feedback. On the whole, it was found that the data collected in the open-ended questions, while very useful in painting a picture of individuals’ experiences in the course, were not helpful for making course redesign decisions. For example, in Table 5, while two respondents found the SoftChalk lesson on instructional design basics useful, another participant highlighted the same lesson as not being useful. This was a trend throughout the responses to the open-ended questions: what one participant liked, another did not. The COAT team decided that the qualitative data gathered in the pilot course should be combined with data gathered in the next phase of the COAT project to see if increasing the sample size would produce clearer, more distinct categories to inform major redesign decisions for future iterations of the training course.
The course was facilitated by a faculty member who had extensive experience teaching online at both the undergraduate and graduate levels. The pilot course instructor was part of the COAT course design team, so she was familiar with the course design and the rationale behind design decisions. Despite her involvement in the design of the course, the instructor acknowledged that
as an instructor who taught both online and in the face-to-face format for many years, I found myself holding back the desire to tweak the look and feel of various course pages or alter its content even in the slightest degree. The urge to make changes in the course was stemming from not having experienced teaching a class where the learning materials were the result of a group effort. Rather, like many other community college instructors, I was used to crafting my own content.
Instead of implementing changes to the course design as she taught, the instructor kept a personal journal in which she noted her thoughts on possible revisions for following iterations of the course, as well as ideas for how various assignments or topics could be changed, added, or deleted. For example, since Google Docs was used for the group project, the instructor suggested in her feedback to the project team that future participants could use wikis instead, which could be assessed more effectively.
The instructor reported that, with robust and logically organized content, teaching the course became a daily enjoyment, enhanced by the participants’ enthusiasm and interest. Moreover, the exchange of perspectives and ideas about education among a diverse group of instructors provided a learning experience for all participants. She noted that the most remarkable outcome, however, was how much the experience deepened her understanding of the distinct roles that course design and online teaching play in students’ satisfaction and success. Having previously been involved in the Quality Matters program, the instructor was aware of these distinct, yet overlapping, components; nevertheless, it was only after she taught the COAT course that the roles of the instructor crystallized for her. In particular, she realized once more that the most important roles of the online instructor were to set the tone for communicating online and to serve as a guide. She embraced both roles and noticed participants’ positive response to prompt and encouraging feedback.
At the beginning of the course, the instructor conducted a synchronous orientation session in the form of a webinar for interested participants. Of the 20 adjunct faculty members selected for the course, eight took part in the web-conferencing orientation session, which covered topics such as course expectations and navigating the learning management system. The need for a second synchronous session arose when participants requested a demonstration of how to add audio content to their courses. Both webinars were very well received by the participants who attended.
The COAT pilot course evaluations indicated that, at the end of the course, the majority of participants found that the course (a) met their needs to prepare them to teach online, (b) modeled good course design and teaching practices, and (c) presented content in a way that met their preferred learning style. However, most participants reported that although the amount of content covered each week was reasonable, it was still challenging to keep up with the workload. The qualitative data painted a rich picture of individuals’ experiences in the course but were not cohesive enough to use for major redesign decisions. As a result, the project leadership decided that more data needed to be collected before determining if any major changes to the course were needed. However, a few minor changes were recommended for the next iteration of the course: adding optional synchronous sessions, reducing and realigning the workload, and making some assignments optional or ungraded.
The phase two report was presented to MOL in July 2010, at which time it was recommended that a phase three be implemented in the academic year 2010–2011. Phase three recommendations included a goal of becoming grant-independent (i.e., financially self-sustaining). To accomplish this, there would be a fee for taking the course. The fee was set at $300 for adjunct faculty living or teaching in Maryland and $600 for all others. These fees were estimated to cover the administrative and instructor costs of running three COAT course sections in 2010–2011.
The primary goal of phase three is to determine if there is sufficient demand for the course at the recommended pricing structure in order for the project to become self-sustaining. To achieve this goal, COAT courses are scheduled for the fall, spring, and summer semesters in 2010–2011. The course will be offered with the same design as the pilot course with the few minor exceptions noted previously: (a) adding optional synchronous session(s), (b) reducing/realigning the workload, and (c) making some assignments optional or ungraded. Participants will again be asked to complete course evaluations. At the end of phase three, the evaluation data for the phase three course offerings will be combined with the phase two data to create a larger sample size. It is hoped that this will provide sufficient data to determine if major design changes are needed. At the end of phase three, the COAT team should also have sufficient data to determine whether the COAT project could be self-sustaining. The COAT team hopes that research conducted in phase three will lead to recommending a phase four of the project in 2011–2012 with the expansion of the number of COAT course offerings and a continuing cycle of evaluation and course improvement. It is anticipated that research in phase four will utilize additional data collection tools in order to address the limitations of this current research study, which focused on the perceptions of a small sample of participants.
Allen, I. E., & Seaman, J. (2010). Learning on demand: Online education in the United States, 2009. Needham, MA: Sloan Consortium. Retrieved from http://www.sloanconsortium.org/publications/survey/pdf/learningondemand.pdf
Bandura, A. (1977). Social learning theory. Englewood Cliffs, NJ: Prentice-Hall.
Berge, Z. (1995). Facilitating computer conferencing: Recommendations from the field. Educational Technology, 35(1), 22-30.
Biro, S. C. (2005). Adjunct faculty perceptions about their preparation, support, and value as online instructors. Retrieved from ProQuest Dissertations & Theses: Full Text. (UMI No. 3255630).
Blodgett, M. (2008). Adjunct faculty perceptions of needs in preparation to teach online. Retrieved from ProQuest Dissertations & Theses: Full Text. (Publication No. AAT 3311265).
Center for Community College Student Engagement. (2009). Making connections: Dimensions of student engagement (2009 CCSSE findings). Austin, TX: The University of Texas at Austin, Community College Leadership Program. Retrieved from http://www.ccsse.org/publications/national_report_2009/CCSSE09_nationalreport.pdf
COAT Project. (2010a). Acknowledgements for the COAT training project. Retrieved from http://marylandonline.org/coat/acknowledgements.htm
COAT Project. (2010b). COAT course competencies. Retrieved from http://www.marylandonline.org/coat/documents/COAT_Course_Competencies.pdf
COAT Project. (2010c). Syllabus. Retrieved from http://marylandonline.org/coat/documents/COAT_syllabus_webpage.pdf
Dubins, B. H., & Graham, M. B. (2009, August). Training instructors to teach online: Research on competencies/best practices. Paper presented at the 25th Annual Conference on Distance Teaching and Learning, Madison, WI. Retrieved from http://www.uwex.edu/disted/conference/Resource_library/proceedings/09_20433.pdf
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105. doi:10.1016/S1096-7516(00)00016-6
Goodyear, P., Salmon, G., Spector, J. M., Steeples, C., & Tickner, S. (2001). Competencies for online teaching: A special report. Educational Technology Research and Development, 49(1), 65-72. doi:10.1007/BF02504508
Instructional Design Affinity Group. (2010). IDAG instructional design affinity group. Retrieved from http://www.marylanddla.org/idag/
Instructional Technology Council. (2010). Trends in eLearning: Tracking the impact of eLearning at community colleges. Retrieved from http://www.itcnetwork.org/file.php?file=%2F1%2FITCAnnualSurvey2009Results.pdf
Kanuka, H., Jugdev, K., Heller, R., & West, D. (2008). The rise of the teleworker: False promises and responsive solutions. Higher Education, 56(2), 149-165. doi:10.1007/s10734-007-9095-z
Koro-Ljungberg, M., Yendol-Hoppey, D., Smith, J. J., & Hayes, S. B. (2009). (E)pistemological awareness, instantiation of methods, and uninformed methodological ambiguity in qualitative research projects. Educational Researcher, 38(9), 687-699. doi:10.3102/0013189X09351980
MarylandOnline. (2010). Vision and mission. Retrieved from http://marylandonline.org/about/vision-and-mission
Maryland Higher Education Commission. (n.d.). Colleges and universities. Retrieved from http://www.mhec.state.md.us/higherEd/colleges_universities/index.asp
Morris, L. V., & Finnegan, C. L. (2008–2009). Best practices in predicting and encouraging student persistence and achievement online. Journal of College Student Retention: Research, Theory and Practice, 10(1), 55-64. doi:10.2190/CS.10.1.e
Morrison, G. R., Ross, S., & Kemp, J. E. (2007). Designing effective instruction (5th ed.). New York: John Wiley and Sons.
Pagliari, L., Batts, D., & McFadden, C. (2009). Desired versus actual training for online instructors in community colleges. Online Journal of Distance Learning Administration, 12(4). Retrieved from http://www.westga.edu/~distance/ojdla/
Quality Matters. (2010). FIPSE grant project. Retrieved from http://www.qmprogram.org/research-grants/fipse
Ragan, L. (2009). Defining competencies for online teaching success. Distance Education Report, 13(19), 3-6. Retrieved from http://www.magnapubs.com/distanceeducation/
Seaman, J. (2009). Online learning as a strategic asset. Volume II: The paradox of faculty voices. Washington, DC: Association of Public and Land-grant Universities. Retrieved from http://www.sloanconsortium.org/sites/default/files/APLU_online_strategic_asset_vol2-1.pdf
Smith, T. C. (2005). Fifty-one competencies for online instruction. The Journal of Educators Online, 2(2). Retrieved from http://www.thejeo.com/
Tipple, R. (2010). Effective leadership of online adjunct faculty. Online Journal of Distance Learning Administration, 13(1). Retrieved from http://www.westga.edu/~distance/ojdla/
Varvel, V. E., Jr. (2007). Master online teacher competencies. Online Journal of Distance Learning Administration, 10(1). Retrieved from http://www.westga.edu/~distance/ojdla/