What If It’s All an Illusion? To What Extent Can We Rely on Self-Reported Data in Open, Online, and Distance Education Systems?

Authors

Yavuz Akbulut, Abdullah Saykılı, Aylin Öztürk, Aras Bozkurt

DOI:

https://doi.org/10.19173/irrodl.v24i3.7321

Keywords:

open and distance learning, higher education, self-report, inconsistent responding, learning analytics

Abstract

Online surveys are widely used in social science research as well as in empirical studies of open, online, and distance education. However, students’ responses are likely to be at odds with their actual behavior. In this context, we examined the discrepancies between self-reported use and actual use (i.e., learning analytics data) among 20,646 students in an open, online, and distance education system. The proportion of consistent responses to each of the 11 questions ranged from 43% to 70%, and actual access to learning resources was significantly lower than self-reported use; that is, students over-reported their use of learning resources. Female students were more likely to respond consistently. Frequency of visits to the open, online, and distance education system, grade point average, self-reported satisfaction, and age were positively correlated with consistency, whereas students’ current semester was negatively correlated with it. Although consistency was not maintained between actual use and self-reported use, it was maintained between some of the self-report measures (e.g., use vs. satisfaction). The findings suggested that system and performance data should be considered alongside self-reported data in order to draw more robust conclusions about the accountability of open, online, and distance education systems.
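To illustrate the kind of cross-check the abstract describes, the minimal Python sketch below compares a dichotomous self-report item against system log data and computes a per-item consistency rate and an over-reporting rate. It is an illustrative sketch only: the column names (reported_use, logged_accesses) and the rule that any logged access counts as actual use are assumptions, not the coding scheme used in the article.

```python
# Illustrative sketch only: column names and the agreement rule are
# assumptions, not the coding scheme used in the study.
import pandas as pd

# Hypothetical data: one row per student for a single survey item.
# reported_use: student said they used the resource (True/False).
# logged_accesses: number of accesses recorded by the system.
df = pd.DataFrame({
    "reported_use":    [True, True, False, True, False],
    "logged_accesses": [5, 0, 0, 2, 1],
})

# A response is "consistent" when the self-report matches the log
# (here: any logged access counts as actual use).
actual_use = df["logged_accesses"] > 0
consistent = df["reported_use"] == actual_use

consistency_rate = consistent.mean()  # share of consistent responses
over_reporting = (df["reported_use"] & ~actual_use).mean()  # said yes, log says no

print(f"Consistency: {consistency_rate:.0%}, over-reporting: {over_reporting:.0%}")
```

On this toy data the consistency rate is 60% and the over-reporting rate is 20%; in the study, the analogous per-item rates ranged from 43% to 70%.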

Author Biographies

Yavuz Akbulut, Anadolu University

Yavuz Akbulut is a professor in the Faculty of Education at Anadolu University, Turkey. He holds an MA in computer-assisted language learning and a PhD in instructional design and technology. His research interests include cyberpsychology and learning, multitasking, and online gaming behaviours.

Abdullah Saykılı, Anadolu University

Abdullah Saykılı is a researcher in the Department of Distance Education, Open Education Faculty at Anadolu University, Turkey. He holds MA and PhD degrees in distance education. Dr. Saykılı’s research interests include open and distance learning, learning analytics, educational data mining, and interaction in online learning environments.

Aylin Öztürk, Anadolu University

Aylin Öztürk is a researcher in the Department of Distance Education, Open Education Faculty at Anadolu University, Turkey. She holds MA and PhD degrees in distance education. Dr. Öztürk’s research interests include open and distance learning, learning analytics, educational data mining, artificial intelligence, and machine learning.

Aras Bozkurt, Anadolu University

Aras Bozkurt is a researcher and faculty member in the Department of Distance Education, Open Education Faculty at Anadolu University, Turkey. He holds MA and PhD degrees in distance education. Dr. Bozkurt conducts empirical studies on distance education, open and distance learning, online learning, networked learning, and educational technology to which he applies various critical theories, such as connectivism, rhizomatic learning, and heutagogy. He is also interested in emerging research paradigms, including social network analysis, sentiment analysis, and data mining.

References

Akbulut, Y. (2015). Predictors of inconsistent responding in web surveys. Internet Research, 25(1), 131–147. https://doi.org/10.1108/IntR-01-2014-0017

Akbulut, Y., Dönmez, O., & Dursun, Ö. Ö. (2017). Cyberloafing and social desirability bias among students and employees. Computers in Human Behavior, 72, 87–95. https://doi.org/10.1016/j.chb.2017.02.043

Alqurashi, E. (2019). Predicting student satisfaction and perceived learning within online learning environments. Distance Education, 40(1), 133–148. https://doi.org/10.1080/01587919.2018.1553562

Azevedo, R. (2015). Defining and measuring engagement and learning in science: Conceptual, theoretical, methodological, and analytical issues. Educational Psychologist, 50(1), 84–94. https://doi.org/10.1080/00461520.2015.1004069

Baltar, F., & Brunet, I. (2012). Social research 2.0: Virtual snowball sampling method using Facebook. Internet Research, 22(1), 57–74. https://doi.org/10.1108/10662241211199960

Bandura, A. (1977). Social learning theory. Prentice Hall.

Borgonovi, F., Ferrara, A., & Piacentini, M. (2023). From asking to observing. Behavioural measures of socio-emotional and motivational skills in large-scale assessments. Social Science Research, 112, 102874. https://doi.org/10.1016/j.ssresearch.2023.102874

Bozkurt, A., Akgun-Özbek, E., Onrat-Yılmazer, S., Erdoğdu, E., Uçar, H., Güler, E., Sezgin, S., Karadeniz, A., Sen, N., Göksel-Canbek, N., Dinçer, G. D., Arı, S., & Aydın, C. H. (2015). Trends in distance education research: A content analysis of journals 2009–2013. International Review of Research in Open and Distributed Learning, 16(1), 330–363. https://doi.org/10.19173/irrodl.v16i1.1953

Castro, R. (2013). Inconsistent respondents and sensitive questions. Field Methods, 25(3), 283–298. https://doi.org/10.1177/1525822x12466988

Chao, C. M. (2019). Factors determining the behavioral intention to use mobile learning: An application and extension of the UTAUT model. Frontiers in Psychology, 10, 1652. https://doi.org/10.3389/fpsyg.2019.01652

Chen, P. H. (2010). Item order effects on attitude measures (Publication No. 778) [Doctoral dissertation, University of Denver]. Electronic Theses and Dissertations. https://digitalcommons.du.edu/etd/778

Chesney, T., & Penny, K. (2013). The impact of repeated lying on survey results. SAGE Open, 3(1), 1–9. https://doi.org/10.1177/2158244012472345

DeSimone, J. A., & Harms, P. D. (2018). Dirty data: The effects of screening respondents who provide low-quality data in survey research. Journal of Business and Psychology, 33, 559–577. https://doi.org/10.1007/s10869-017-9514-9

DeSimone, J. A., Harms, P. D., & DeSimone, A. J. (2015). Best practice recommendations for data screening. Journal of Organizational Behavior, 36, 171–181. https://doi.org/10.1002/job.1962

Dönmez, O., & Akbulut, Y. (2016). Siber zorbalık çalışmalarında sosyal beğenirlik etmeni [Social desirability bias in cyberbullying research]. Eğitim Teknolojisi Kuram ve Uygulama, 6(2), 1–18. https://dergipark.org.tr/tr/pub/etku/issue/24420/258838

Ellis, R. A., Han, F., & Pardo, A. (2017). Improving learning analytics—Combining observational and self-report data on student learning. Journal of Educational Technology & Society, 20(3), 158–169. https://www.jstor.org/stable/26196127

Evans, J. R., & Mathur, A. (2005). The value of online surveys. Internet Research, 15(2), 195–219. https://doi.org/10.1108/10662240510590360

Gasevic, D., Jovanovic, J., Pardo, A., & Dawson, S. (2017). Detecting learning strategies with analytics: Links with self-reported measures and academic performance. Journal of Learning Analytics, 4(2), 113–128. https://doi.org/10.18608/jla.2017.42.10

Gregori, A., & Baltar, F. (2013). ‘Ready to complete the survey on Facebook’: Web 2.0 as a research tool in business studies. International Journal of Market Research, 55(1), 131–148. https://doi.org/10.2501/ijmr-2013-010

Grieve, R., & Elliott, J. (2013). Cyberfaking: I can, so I will? Intentions to fake in online psychological testing. Cyberpsychology, Behavior, and Social Networking, 16(5), 364–369. https://doi.org/10.1089/cyber.2012.0271

Hsieh, P., Acee, T., Chung, W., Hsieh, Y., Kim, H., Thomas, G., Levin, J. R., & Robinson, D. H. (2005). Is educational intervention research on the decline? Journal of Educational Psychology, 97(4), 523–529. https://doi.org/10.1037/0022-0663.97.4.523

Huang, J. L., Liu, M., & Bowling, N. A. (2015). Insufficient effort responding: Examining an insidious confound in survey data. Journal of Applied Psychology, 100(3), 828–845. https://doi.org/10.1037/a0038510

Iaconelli, R., & Wolters, C. A. (2020). Insufficient effort responding in surveys assessing self-regulated learning: Nuisance or fatal flaw? Frontline Learning Research, 8(3), 104–125. https://doi.org/10.14786/flr.v8i3.521

Kara Aydemir, A. G., & Can, G. (2019). Educational technology research trends in Turkey from a critical perspective: An analysis of postgraduate theses. British Journal of Educational Technology, 50(3), 1087–1103. https://doi.org/10.1111/bjet.12780

Keusch, F. (2013). The role of topic interest and topic salience in online panel web surveys. International Journal of Market Research, 55(1), 59–80. https://doi.org/10.2501/ijmr-2013-007

Krosnick, J. A. (1991). Response strategies for coping with the cognitive demands of attitude measures in surveys. Applied Cognitive Psychology, 5(3), 213–236. https://doi.org/10.1002/acp.2350050305

Küçük, S., Aydemir, M., Yildirim, G., Arpacik, O., & Goktas, Y. (2013). Educational technology research trends in Turkey from 1990 to 2011. Computers & Education, 68, 42–50. https://doi.org/10.1016/j.compedu.2013.04.016

Maniaci, M. R., & Rogge, R. D. (2014). Caring about carelessness: Participant inattention and its effects on research. Journal of Research in Personality, 48, 61–83. https://doi.org/10.1016/j.jrp.2013.09.008

Mishra, P., Koehler, M. J., & Kereluik, K. (2009). Looking back to the future of educational technology. TechTrends, 53(5), 48–53. https://doi.org/10.1007/s11528-009-0325-3

Pekrun, R. (2020). Commentary: Self-report is indispensable to assess students’ learning. Frontline Learning Research, 8(3), 185–193. https://doi.org/10.14786/flr.v8i3.637

Pelletier, K., Brown, M., Brooks, D. C., McCormack, M., Reeves, J., Arbino, N., Bozkurt, A., Crawford, S., Czerniewicz, L., Gibson, R., Linder, K., Mason, J., & Mondelli, V. (2021). 2021 EDUCAUSE Horizon report: Teaching and learning edition. EDUCAUSE. https://www.learntechlib.org/p/219489/

Reeves, T. C., & Lin, L. (2020). The research we have is not the research we need. Educational Technology Research and Development, 68, 1991–2001. https://doi.org/10.1007/s11423-020-09811-3

Rhode, J. (2009). Interaction equivalency in self-paced online learning environments: An exploration of learner preferences. The International Review of Research in Open and Distributed Learning, 10(1). https://doi.org/10.19173/irrodl.v10i1.603

Rosen, J. A., Porter, S. R., & Rogers, J. (2017). Understanding student self-reports of academic performance and course-taking behavior. AERA Open, 3(2). https://doi.org/10.1177/2332858417711427

Ross, S. M., & Morrison, G. R. (2008). Research on instructional strategies. In J. M. Spector, M. D. Merrill, J. van Merriënboer, & M. P. Driscoll (Eds.), Handbook of research on educational communications and technology (3rd ed., pp. 719–730). Routledge. https://doi.org/10.1007/978-1-4614-3185-5_3

Schneider, S., May, M., & Stone, A. A. (2018). Careless responding in Internet-based quality of life assessments. Quality of Life Research, 27, 1077–1088. https://doi.org/10.1007/s11136-017-1767-2

Selwyn, N. (2020). Re-imagining ‘learning analytics’… a case for starting again? The Internet and Higher Education, 46, 100745. https://doi.org/10.1016/j.iheduc.2020.100745

Siddiq, F., & Scherer, R. (2019). Is there a gender gap? A meta-analysis of the gender differences in students’ ICT literacy. Educational Research Review, 27, 205–217. https://doi.org/10.1016/j.edurev.2019.03.007

So, H. J., & Brush, T. A. (2008). Student perceptions of collaborative learning, social presence and satisfaction in a blended learning environment: Relationships and critical factors. Computers & Education, 51(1), 318–336. https://doi.org/10.1016/j.compedu.2007.05.009

Steger, D., Jankowsky, K., Schroeders, U., & Wilhelm, O. (2022). The road to hell is paved with good intentions: How common practices in scale construction hurt validity. Assessment. https://doi.org/10.1177/10731911221124846

Watson, S. L., Watson, W. R., Yu, J. H., Alamri, H., & Mueller, C. (2017). Learner profiles of attitudinal learning in a MOOC: An explanatory sequential mixed methods study. Computers & Education, 114, 274–285. https://doi.org/10.1016/j.compedu.2017.07.005

Wilson, A., Watson, C., Thompson, T. L., Drew, V., & Doyle, S. (2017). Learning analytics: Challenges and limitations. Teaching in Higher Education, 22(8), 991–1007. https://doi.org/10.1080/13562517.2017.1332026

Winne, P. H., & Jamieson-Noel, D. (2002). Exploring students’ calibration of self reports about study tactics and achievement. Contemporary Educational Psychology, 27(4), 551–572. https://doi.org/10.1016/s0361-476x(02)00006-1

Wu, J. H., Tennyson, R. D., & Hsia, T. L. (2010). A study of student satisfaction in a blended e-learning system environment. Computers & Education, 55(1), 155–164. https://doi.org/10.1016/j.compedu.2009.12.012

Yörük Açıkel, B., Turhan, U., & Akbulut, Y. (2018). Effect of multitasking on simulator sickness and performance in 3D aerodrome control training. Simulation & Gaming, 49(1), 27–49. https://doi.org/10.1177/1046878117750417

Zhao, Q., & Linderholm, T. (2008). Adult metacomprehension: Judgment processes and accuracy constraints. Educational Psychology Review, 20, 191–206. https://doi.org/10.1007/s10648-008-9073-8

Zhou, M., & Winne, P. H. (2012). Modeling academic achievement by self-reported versus traced goal orientation. Learning and Instruction, 22(6), 413–419. https://doi.org/10.1016/j.learninstruc.2012.03.004

Zhu, M., Sari, A. R., & Lee, M. M. (2020). A comprehensive systematic review of MOOC research: Research techniques, topics, and trends from 2009 to 2019. Educational Technology Research and Development, 68, 1685–1710. https://doi.org/10.1007/s11423-020-09798-x

Published

2023-09-06

How to Cite

Akbulut, Y., Saykılı, A., Öztürk, A., & Bozkurt, A. (2023). What If It’s All an Illusion? To What Extent Can We Rely on Self-Reported Data in Open, Online, and Distance Education Systems? The International Review of Research in Open and Distributed Learning, 24(3), 1–17. https://doi.org/10.19173/irrodl.v24i3.7321

Issue

Section

Research Articles