Using the Critical Incident Questionnaire as a Formative Evaluation Tool to Inform Online Course Design: A Qualitative Study

Authors

  • Anita Samuel, Uniformed Services University of Health Sciences, Bethesda, Maryland, USA
  • Dr. Simone C.O. Conceição, University of Wisconsin-Milwaukee, Milwaukee, Wisconsin, USA

DOI:

https://doi.org/10.19173/irrodl.v23i2.5959

Keywords:

student evaluation of teaching, Critical Incident Questionnaire, online course design, formative assessment

Abstract

The online instructor plays a prominent role in shaping how students respond to an online course, from designing the course structure, activities, and assignments to encouraging interaction. To develop effective online courses, therefore, instructors need robust feedback on their design strategies. Student evaluations of teaching (SETs) function as summative evaluations of course design and delivery. Yet feedback from SETs can only be integrated into the next iteration of a course, so it fails to benefit the students who provide it. One suggestion is to use midsemester formative evaluation to inform course design in real time. A qualitative research study was conducted to explore whether the Critical Incident Questionnaire (CIQ) could serve as an effective formative evaluation tool to inform real-time online course design and delivery. Thematic analysis was conducted on midcourse evaluations obtained from 70 students in six fully online master’s-level courses. The study yielded three key findings. First, the CIQ can provide opportunities for real-time adjustments to online course design and inform future redesign of online courses. Second, responses received via the CIQ prioritize the student voice and experience by focusing on the factors most critical to them. Finally, this deep-dive analysis reinforces the enduring factors that contribute to effective online course design and delivery. A recommendation for practice is to use the CIQ to gather formative feedback from students; this feedback can then be used to adjust course design as needed.

Author Biographies

Anita Samuel, Uniformed Services University of Health Sciences, Bethesda, Maryland, USA

Anita Samuel is Assistant Director of Distance Learning and Assistant Professor with the School of Medicine at the Uniformed Services University of Health Sciences.

She earned her doctorate in Urban Education with a specialization in Administrative Leadership in Adult, Continuing, and Higher Education, and her areas of research are online education and faculty development. She strongly believes that student success depends on educator satisfaction and is passionate about the welfare of educators. She has received various awards, including the 2019 Innovation in Education Award at USU and the Graduate Student Research Award for her work on faculty experiences in the online environment.

Dr. Simone C.O. Conceição, University of Wisconsin-Milwaukee, Milwaukee, Wisconsin, USA

Simone C. O. Conceição is professor emerita at the University of Wisconsin-Milwaukee and owner of SCOC Consulting, a consulting firm that helps educators, designers, and administrators understand and apply global strategic thinking, intentional learning design, curriculum development, instructor training, and program evaluation.

References

Anderson, T. (2011). The theory and practice of online learning (2nd ed.). Athabasca University Press. https://auspace.athabascau.ca/bitstream/handle/2149/411/?sequence=1

Baldwin, S. J., & Trespalacios, J. (2017). Evaluation instruments and good practices in online education. Online Learning, 21(2). https://olj.onlinelearningconsortium.org/index.php/olj/article/view/913

Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R. M., Surkes, M. A., & Bethel, E. C. (2009). A meta-analysis of three types of interaction treatments in distance education. Review of Educational Research, 79(3), 1243–1289. https://doi.org/10.3102/0034654309333844

Blackboard. (2017, April 10). Blackboard exemplary course program rubric. https://www.blackboard.com/resources/are-your-courses-exemplary

Boring, A., Ottoboni, K., & Stark, P. (2016, January 7). Student evaluations of teaching (mostly) do not measure teaching effectiveness. ScienceOpen Research, 1–11. https://www.scienceopen.com/document/read?vid=818d8ec0-5908-47d8-86b4-5dc38f04b23e

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa

Brookfield, S. (1998). Critically reflective practice. Journal of Continuing Education in the Health Professions, 18(4), 197–205. https://doi.org/10.1002/chp.1340180402

California State University. (2019). CSU QLT course review instrument. https://docs.google.com/document/d/1ilqtDHjYfuJfjq1f8lG_bGhH9Xskcad-2a8aaPmnHG8/edit

Cohen, P. A. (1980). Effectiveness of student-rating feedback for improving college instruction: A meta-analysis of findings. Research in Higher Education, 13(4), 321–341.

Crews, T. B., Wilkinson, K., & Neill, J. K. (2015). Principles for good practice in undergraduate education: Effective online course design to assist students’ success. Journal of Online Learning and Teaching, 11(1), 87–103. https://uscdmc.sc.edu/about/offices_and_divisions/cte/instructional_design/docs/principles_good_practice_undergraduate_education_crews.pdf

Dancer, D., & Kamvounias, P. (2005). Student involvement in assessment: A project designed to assess class participation fairly and reliably. Assessment & Evaluation in Higher Education, 30(4), 445–454. https://doi.org/10.1080/02602930500099235

Erikson, M., Erikson, M. G., & Punzi, E. (2016). Student responses to a reflexive course evaluation. Reflective Practice, 17(6), 663–675. https://doi.org/10.1080/14623943.2016.1206877

Finelli, C. J., Ott, M., Gottfried, A. C., Hershock, C., O’Neal, C., & Kaplan, M. (2008). Utilizing instructional consultations to enhance the teaching performance of engineering faculty. Journal of Engineering Education, 97(4), 397–411. https://doi.org/10.1002/j.2168-9830.2008.tb00989.x

Fisher, R., & Miller, D. (2008). Responding to student expectations: A partnership approach to course evaluation. Assessment & Evaluation in Higher Education, 33(2), 191–202. https://doi.org/10.1080/02602930701292514

Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2, 87–105. https://doi.org/10.1016/S1096-7516(00)00016-6

Gehringer, E. (2010, June 20–23). Daily course evaluation with Google forms [Paper presentation]. 2010 American Society for Engineering Education Annual Conference & Exposition, Louisville, KY, United States. https://doi.org/10.18260/1-2--16350

Glowacki-Dudka, M., & Barnett, N. (2007). Connecting critical reflection and group development in online adult education classrooms. International Journal of Teaching and Learning in Higher Education, 19(1), 43–52. https://eric.ed.gov/?id=EJ901286

Harasim, L. (2017). Learning theory and online technologies. Taylor & Francis. https://doi.org/10.4324/9781315716831

Hessler, H. B., & Taggart, A. R. (2011). What’s stalling learning? Using a formative assessment tool to address critical incidents in class. International Journal for the Scholarship of Teaching and Learning, 5(1), Article 9. https://doi.org/10.20429/ijsotl.2011.050109

Hurney, C., Harris, N., Bates Prins, S., & Kruck, S. E. (2014). The impact of a learner-centered, mid-semester course evaluation on students. The Journal of Faculty Development, 28(3), 55–62. https://cetl.uni.edu/sites/default/files/impact_of_learner-centered_by_hurney_1.pdf

Jacobs, M. A. (2015). By their pupils they’ll be taught: Using Critical Incident Questionnaire as feedback. Journal of Invitational Theory and Practice, 21, 9–22. https://eric.ed.gov/?id=EJ1163006

Jaggars, S. S., & Xu, D. (2016). How do online course design features influence student performance? Computers & Education, 95, 270–284. https://doi.org/10.1016/j.compedu.2016.01.014

Keefer, J. M. (2009). The Critical Incident Questionnaire (CIQ): From research to practice and back again. In R. L. Lawrence (Ed.), Proceedings of the 50th Annual Adult Education Research Conference (pp. 177–182). https://digitalcommons.nl.edu/cgi/viewcontent.cgi?referer=https://scholar.google.com/&httpsredir=1&article=1000&context=ace_aerc#page=191

Kyei-Blankson, L., Ntuli, E., & Donnelly, H. (2019). Establishing the importance of interaction and presence to student learning in online environments. Journal of Interactive Learning Research, 30(4), 539–560. https://www.learntechlib.org/p/161956/

Linstrum, K. S., Ballard, G., & Shelby, T. (2012). Formative evaluation: Using the Critical Incident Questionnaire in a graduate counseling course on cultural diversity. Journal of Intercultural Disciplines, 10, 94–102. https://www.proquest.com/docview/1033777245

Martin, F., & Bolliger, D. U. (2018). Engagement matters: Student perceptions on the importance of engagement strategies in the online learning environment. Online Learning, 22(1), 205–222. https://doi.org/10.24059/olj.v22i1.1092

Martin, F., Ritzhaupt, A., Kumar, S., & Budhrani, K. (2019). Award-winning faculty online teaching practices: Course design, assessment and evaluation, and facilitation. The Internet and Higher Education, 42, 34–43. https://doi.org/10.1016/j.iheduc.2019.04.001

McKeachie, W. J., Lin, Y.-G., & Mann, W. (1971). Student ratings of teacher effectiveness: Validity studies. American Educational Research Journal, 8(3), 435–445. https://doi.org/10.3102/00028312008003435

Mitchell, K. M., & Martin, J. (2018). Gender bias in student evaluations. PS: Political Science & Politics, 51(3), 648–652. https://doi.org/10.1017/S104909651800001X

Moore, M. (1997). Theory of transactional distance. In D. Keegan (Ed.), Theoretical principles of distance education (pp. 22–38). Routledge. http://www.c3l.uni-oldenburg.de/cde/found/moore93.pdf

Moore, M. G. (2013). Handbook of distance education. Routledge.

Phelan, L. (2012). Interrogating students’ perceptions of their online learning experiences with Brookfield’s Critical Incident Questionnaire. Distance Education, 33(1), 31–44. https://doi.org/10.1080/01587919.2012.667958

Picciano, A. G. (2017). Theories and frameworks for online education: Seeking an integrated model. Online Learning, 21(3). https://doi.org/10.24059/olj.v21i3.1225

Quality Matters. (2018). Specific review standards from the QM higher education rubric, sixth edition. https://www.qualitymatters.org/qa-resources/rubric-standards/higher-ed-rubric

Rao, V., & Woolcock, M. (2003). Integrating qualitative and quantitative approaches in program evaluation. In F. Bourguignon & L. A. Pereira da Silva (Eds.), The impact of economic policies on poverty and income distribution: Evaluation techniques and tools (pp. 165–190). World Bank and Oxford University Press.

Reid, L. D. (2010). The role of perceived race and gender in the evaluation of college teaching on RateMyProfessors.Com. Journal of Diversity in Higher Education, 3(3), 137–152. https://doi.org/10.1037/a0019865

Rodin, M., & Rodin, B. (1973). Student evaluations of teachers. The Journal of Economic Education, 5(1), 5–9. https://www.jstor.org/stable/1734252

Samuel, A. (2020). Zones of agency: Understanding online faculty experiences of presence. The International Review of Research in Open and Distributed Learning, 21(4), 79–95. https://doi.org/10.19173/irrodl.v21i4.4905

Stark, P., & Freishtat, R. (2014, September 29). An evaluation of course evaluations. ScienceOpen Research, 1–7. https://doi.org/10.14293/S2199-1006.1.SOR-EDU.AOFRQA.v1

State University of New York. (2018). The SUNY online course quality review rubric: OSCQR. https://oscqr.suny.edu/

Steyn, C., Davies, C., & Sambo, A. (2019). Eliciting student feedback for course development: The application of a qualitative course evaluation tool among business research students. Assessment & Evaluation in Higher Education, 44(1), 11–24. https://doi.org/10.1080/02602938.2018.1466266

Subramanian, K., & Budhrani, K. (2020, February). Influence of course design on student engagement and motivation in an online course. In SIGCSE ’20: Proceedings of the 51st ACM Technical Symposium on Computer Science Education (pp. 303–308). Association for Computing Machinery. https://dl.acm.org/doi/abs/10.1145/3328778.3366828

Swan, K. (2002). Building learning communities in online courses: The importance of interaction. Education, Communication & Information, 2(1), 23–49. https://doi.org/10.1080/1463631022000005016

Uttl, B., White, C. A., & Gonzalez, D. W. (2017). Meta-analysis of faculty’s teaching effectiveness: Student evaluation of teaching ratings and student learning are not related. Studies in Educational Evaluation, 54, 22–42. https://doi.org/10.1016/j.stueduc.2016.08.007

Veeck, A., O’Reilly, K., MacMillan, A., & Yu, H. (2016). The use of collaborative midterm student evaluations to provide actionable results. Journal of Marketing Education, 38(3), 157–169. https://doi.org/10.1177/0273475315619652

Wolbring, T., & Treischl, E. (2016). Selection bias in students’ evaluation of teaching. Research in Higher Education, 57(1), 51–71. https://link.springer.com/content/pdf/10.1007/s11162-015-9378-7.pdf

Published

2022-05-02

How to Cite

Samuel, A., & Conceição, S. (2022). Using the Critical Incident Questionnaire as a Formative Evaluation Tool to Inform Online Course Design: A Qualitative Study. The International Review of Research in Open and Distributed Learning, 23(2), 151–169. https://doi.org/10.19173/irrodl.v23i2.5959

Issue

Vol. 23 No. 2 (2022)

Section

Research Articles
