This report provides an introduction to online polling in its various forms (questionnaires, quizzes, surveys, assessment products, etc.), and discusses its advantages and problems in online education.
The advent of online technologies during the 1990s led to the development of numerous new automated data collection techniques and pre-configured Web polls (Ostendorf, 1994). These tend to emulate the hand-held keypad systems used for anonymous polling in political and advertising research (Baggaley, 1997). Uses of the term “poll” differ widely. Mancinelle (2003) suggests that polls refer to a single question, while surveys are more complex. An earlier report in the current series (Technical Report XII), however, recommended the use of the term “online polling” in referring generally to “questionnaires, quizzing, survey and assessment products” (Baggaley, Kane, and Wade, 2002). The online format typically associated with these activities is one in which participants place closed-ended “votes” in response to fixed questions or statements, and in which the votes are counted. The current use of “polling” as a generic term is thus consistent with the definition provided by the Concise Oxford Dictionary (Sykes, 1976), which associates polling with voting mediated by the counting of ballots. For the purposes of the current discussion, an online polling system may be further defined as an asynchronous or real-time process of information gathering, in which responses to questions are obtained via Web-based formats.
The current authors have identified over 100 online polling tools to date. Some products offer a reduced-capability, free version that permits limited polling, with restrictions on the software’s features and on the number and length of the instruments generated. Typically, the software either generates Hypertext Markup Language (HTML) code for posting on websites, or the software developer hosts the poll and a Uniform Resource Locator (URL) is sent by email to prospective respondents, with an invitation to participate. Access codes and other programming tools can be used to prevent unwanted or repeat responses. A variety of question templates is available (e.g., yes/no, multiple choice, open answer, forced ranking, Likert scale, and paired comparisons), as are “themed” templates (e.g., course evaluations and project planning tools). Polls may be designed to require the completion of all items, and to accept only specific types of data such as numbers or letters. Polling tools vary in terms of the ease of poll construction, the extent of customised reporting, the degree of feedback available to the respondents, personalised greetings, and branding by the entity using the software (Bonk, 2003).
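By way of illustration, the following minimal sketch shows the general kind of HTML form code that such tools generate or host; the submission URL, field names, and access code shown here are hypothetical, and the exact markup varies from product to product:

    <!-- Hypothetical example of tool-generated poll markup -->
    <form action="http://example.com/poll/submit" method="post">
      <!-- hidden access code, issued per respondent to deter repeat voting -->
      <input type="hidden" name="access_code" value="A1B2C3">
      <p>1. The course materials were clearly organised.</p>
      <!-- Likert-scale item entered as a single radio-button group -->
      <input type="radio" name="q1" value="1"> Strongly disagree
      <input type="radio" name="q1" value="2"> Disagree
      <input type="radio" name="q1" value="3"> Neutral
      <input type="radio" name="q1" value="4"> Agree
      <input type="radio" name="q1" value="5"> Strongly agree
      <p>2. Approximately how many hours per week did you spend online?</p>
      <!-- free-entry item; the host would accept only numeric data here -->
      <input type="text" name="q2" size="4">
      <input type="submit" value="Submit vote">
    </form>

When the developer hosts the poll, a form of this kind is served from the developer’s own site and the “votes” are tallied there. The hidden access code illustrates one simple means of screening out unwanted or repeat submissions, while completion and data-type rules would be enforced when the submitted form is processed.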
Email polls embedded in the body of the message have been found to produce a five-fold increase in response over those sent as attachments (Moss and Hendry, 2002). According to Kehoe and Pitkow (1996), “implementation of HTML forms turned the Web into a two-way medium to contact the audience directly.” Online polling is regarded as having advantages over the pencil-and-paper alternative, including savings of time and money, and fewer data collection errors (Solomon, 2001). It has been described as yielding faster responses, permitting adaptive questioning whereby the questions can be changed according to the users’ input (Watt and Van Den Berg, 1995), and reducing fatigue through easy click-response methods and colour graphics (Bonk, 2003). Handverk, Carson, and Blackwell (2000) have suggested that Web respondents seem more comfortable with providing comments than mail-in poll respondents, possibly owing to “an additional sense of confidentiality.” Carbonaro, Bainbridge, and Wolodko (2002) describe advantages of online polling such as built-in security methods and user-friendly editing features (e.g., copy/paste, data processing, storage, and display). Hitherto, Web polling has been regarded as less suspect than telephone surveys in terms of hidden sales motivation (Yun and Trumbo, 2000), although this may change (see next section). A cost-effectiveness benchmark favouring online polling over more traditional methods has been provided by Dillon (2001).
Significantly for distance education (DE) users, online polling has been regarded as helping to build online communities (Kvitka, 1999). Baggaley, Kane, and Wade (2002) have indicated that online polling can contribute to immediate satisfaction and camaraderie in synchronous discussion. Email surveys have also been described as providing a space for reflective conversation and “an exchange of ideas in which the expression and receipt of ideas leads to the construction of new understanding of their own experience among the participants” (Heflich and Rice, 1999). Baggaley (1997) described real-time polling methods in general as yielding frank and confidential responses on sensitive or embarrassing issues such as AIDS, and pointed out that the instant analysis of real-time polling results can provide timely feedback of individual and group opinions that, in turn, can guide the forum moderator. On balance, it is evident that the World Wide Web has created “an international and amorphous interaction of human agents through the digital transmission of information” (Witmer, Colman, and Katzman, 1999), and is ideal for the sharing of opinions, ideas, and knowledge. Web-based polling can enhance this process and add “collaborative power” to learning (Bonk, 2002).
At present, the programming ability required by some polling software packages is beyond that “of most educational researchers, including those who specialize in technology” (White, Carey, and Dailey, 2000), and some packages fail in browsers that do not fully support JavaScript. The use of complex software features may decrease Web response because of the technical problems and frustrations they can cause for inexperienced users. Web congestion can limit response rates (Solomon, 2001), as can slow Internet connections in the downloading of lengthy instruments and graphic files.
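One simple precaution, sketched below under stated assumptions (the submission URL, field name, and function name are hypothetical), is to write the poll as a plain HTML form that submits normally even when scripting is unavailable, treating client-side JavaScript as an optional convenience rather than a requirement:

    <!-- Illustrative sketch only: the form submits even in browsers without
         JavaScript support, so the host server must repeat any validation. -->
    <form action="http://example.com/poll/submit" method="post"
          onsubmit="return checkComplete(this);">
      <p>Approximately how many hours per week do you study online?</p>
      <input type="text" name="hours" size="4">
      <input type="submit" value="Submit">
    </form>
    <script type="text/javascript">
    // Runs only in script-capable browsers; elsewhere the form submits as usual.
    function checkComplete(form) {
      if (form.hours.value == "") {
        alert("Please answer the question before submitting.");
        return false;  // block submission until the item is completed
      }
      return true;
    }
    </script>

Designed in this way, respondents using older browsers lose only the immediate prompt, not the ability to respond; the burden of validation simply shifts to the hosting server.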
In addition, Carbonaro, Bainbridge, and Wolodko (2002) state that educational survey research conducted via the Web is still largely “devoid of study.” Of 24 newspapers running quick polls, only two used a disclaimer explaining that the poll was unscientific (Schultz, 1999). From the statistical viewpoint, skeptical writers suggest that “most of the self-selected, online polls are worthless” and do not usually meet scientific standards (Rosenblatt, 1999). Online polls commonly involve sample/coverage bias (Solomon, 2001), whereby the polling sample fails to represent the target population owing to the exclusion of individuals who cannot or do not choose to access the Internet. Bonk (2003) points to similar design flaws in online polling implementation, including failure to give respondents clear instructions and accompanying URLs. However, any method is as good or as poor as its users, and online polling methods are as susceptible to refinement as any data collection method. Sampling bias, for example, can be reduced by the use of multimode survey techniques (Yun and Trumbo, 2000). Currently, writers differ on basic methodological issues relating to online polling. While Frary (2003) does not recommend the use of open-ended responses, Yun and Trumbo indicate that Web poll responses to well-designed open-ended questions can be more substantial and more self-disclosing than those elicited by mail-in methods. Schultz (1999) suggests: “If the audience is informed of these deficiencies, online polls could still be used as a means to ignite and channel discussion.”
Usage patterns of online polling are shifting, however. Sax, Gilmartin, and Bryant (2003) have shown that recent online polls tend to have a lower response rate than print polls among students. Moss and Hendry (2002) note that response rates for email surveys appear to be declining as email traffic grows. They argue that “Internet savvy” users may have a shorter attention span than users of print polls, may be subject to more online distraction, and may be aware that polling costs are passed on to users who pay for Internet access and download time. Moss and Hendry also note that password access can reduce response rates. A major current issue for those interested in using online polling is thus the development of “best practices” (see Technical Report XXIII in this series).
Baggaley, J. P. (1997). From Madison Avenue to the Field: cross-cultural uses of media research technology. In M. E. Goldberg, M. Fishbein, and S. E. Middlestadt (Eds.), Social Marketing: theoretical and practical perspectives. London: Lawrence Erlbaum.
Baggaley, J. P., Kane, T., and Wade, W. (2002). Online Polling Services. International Review of Research in Open & Distance Learning, 3(2). Retrieved October 7, 2003 from: http://www.irrodl.org/content/v3.2/tech12.html
Bonk, C. (2002). Collaborative tools for e-Learning. Chief Learning Officer, November. Retrieved October 7, 2003 from: http://www.clomedia.com/content/templates/clo_feature.asp?articleid=41&zoneid=30
Bonk, C. (2003). Surveying Web surveying tools: features and advice. Online Learning Conference, September 20-25, Los Angeles. Retrieved October 7, 2003 from: http://www.onlinelearningconference.com/handouts/handout_151837.ppt
Carbonaro, M., Bainbridge, J., and Wolodko, B. (2002). Using Internet surveys to gather research data from teachers: trials and tribulations. Australian Journal of Educational Technology, 18(3), 275 – 292. Retrieved October 7, 2003 from: http://www.ascilite.org.au/ajet/ajet18/carbonaro.html
Dillon, L. (2001). Online Surveys: lessons learned. Centres for IBM e-business Innovation website. Retrieved October 7, 2003 from: http://www.the-cma.org/council/download-council/2001_ibm_lessons_learned.pdf
Frary, R. B. (2003). A Brief Guide to Questionnaire Development. Virginia Polytechnic Institute & State University. Retrieved October 7, 2003 from: http://www.ericae.net/ft/tamu/vpiques3.htm
Handverk, P., Carson, C., and Blackwell, K. (2000). Online vs. Paper-and-Pencil Surveying of Students: a case study. AIR 2000 Annual Forum paper. ERIC Document # RIEAAPR2001.
Heflich, D., and Rice, M. (1999). Online Survey Research: a venue for reflective conversation and professional development. Society for Information Technology & Teacher Education. ERIC Document # RIEDEC1999.
Kehoe, C. M., and Pitkow, J. E. (1996). Surveying the territory: GVU’s five WWW user surveys. World Wide Web Journal, 1(3). Retrieved October 7, 2003 from: http://www.cc.gatech.edu/gvu/user_surveys/papers/w3j.html
Kvitka, A. (1999). Web Polling and Survey Software. Reprinted in InfoWorld (October 2003). Retrieved October 7, 2003 from: http://archive.infoworld.com/articles/ic/xml/99/11/15/991115icviews.xml
Mancinelle, B. (2003). Reach Out and Ask Someone: how to leverage online surveys to strengthen member relationships. WebSurveyor.com. Retrieved October 7, 2003 from: http://www.websurveyor.com/pdf/AssociationWhitePaper.pdf
Moss, J., and Hendry, G. (2002). Use of electronic surveys in course evaluation. British Journal of Educational Technology, 33(5), 583 – 592. ERIC Document # CIJAAPR2003.
Ostendorf, V. A. (1994). Use of Response Systems in Distance Learning. Proceedings of the 1st International Conference on Open Learning. Brisbane, Australia: ISDL accession # 00008186.
Rosenblatt, A. J. (1999). On-Line Polling: methodological limitations and implications for electronic democracy. Harvard International Journal of Press/Politics, 4(2), 30 – 44.
Sax, L., Gilmartin, S., and Bryant, A. (2003). Assessing response rates and nonresponse bias in Web and paper surveys. Research in Higher Education, 44(4), 409 – 432.
Schultz, T. (1999). Interactive Options in Online Journalism: a content analysis of 100 US newspapers. University of Bremen: Institute for Intercultural and International Studies. Retrieved October 7, 2003 from: http://www.ascusc.org/jcmc/vol5/issue1/schultz.html
Solomon, D. J. (2001). Conducting Web-based Surveys. Office of Educational Research & Development, Washington, DC. ERIC Document # 458291.
Sykes, J. B. (Ed.) (1976). The Concise Oxford Dictionary of Current English. Oxford: Clarendon.
Watt, J., and Van Den Berg, A. (1995). Research Methods for Communication Science. Needham Heights, MA: Simon & Schuster.
White, J., Carey, L. M., and Dailey, K. (2000). Improving Web-based Survey Research Data Collection. 2000 Annual Meeting of the American Educational Research Association, New Orleans, LA. Retrieved October 7, 2003 from: http://www.coedu.usf.edu/jwhite/survey1/SURV-E-2.pdf
Witmer, D. F., Colman, R. W., and Katzman, S. L. (1999). From paper-and-pencil to screen-and-keyboard. In S. Jones (Ed.), Doing Internet Research: critical issues and methods for examining the Net. Thousand Oaks, CA: Sage.
Yun, G. W., and Trumbo, C. (2000). Comparative response to a survey executed by post, e-mail & Web form. Journal of Computer-Mediated Communication, 6(1). Retrieved October 7, 2003 from: http://www.ascusc.org/jcmc/vol6/issue1/yun.html
The next report in the series discusses the “best practices” of online polling.
N.B. Owing to the speed with which Web addresses are changed, the online references cited in this report may be outdated. They can be checked at the Athabasca University software evaluation site: cde.athabascau.ca/softeval/. Italicised product names in this report can be assumed to be registered trademarks.
JPB. Series Editor, Technical Notes