October - 2003

Technical Evaluation Report

21. 100 Collaborative Products and their Uses

Jon Baggaley
Professor and Series Editor
Athabasca University – Canada’s Open University

Jim Depow, Jim Klaas and Norine Wark
Masters of Distance Education Program
Athabasca University – Canada’s Open University

Abstract

This report highlights trends that have emerged from the evaluation of 100 online collaborative tools in this series of reports so far (2001-03). Emphasis is placed upon the special requirements of distance education (DE) users of collaborative tools in the selection of online text-, audio-, and video-conferencing, polling, and whiteboard methods, and of integrated course delivery systems combining all of these features. The technical and didactic skills needed to use collaborative tools effectively are illustrated in relation to a standard freeware product for online audio-conferencing.

Introduction

Many collaborative software tools are initially developed for other markets: e.g., the lucrative corporate training industry and campus-based teaching. The criteria for software selection in those contexts, however, differ from those of DE usage. Software that works well in an expensively equipped central laboratory, for example, may not work at all for students who are restricted to using less sophisticated home computers in their online studies; and software vendors are often unaware of, and fail to acknowledge, these problems in promoting their products to the DE sector. DE students, therefore, provide an important perspective on the tools’ benefits and shortcomings, and enable the DE teacher to select software that works efficiently on a wide range of student Internet platforms. The current series of evaluation reports and its accompanying website were established in late 2001 to identify the available products and services for making online DE optimally interactive and collaborative, and to evaluate the tools from the DE student viewpoint. As the project ends its second year, the number of collaborative tools reviewed has reached 100. We take this opportunity to identify trends observed during the evaluation project to date.

Types of Collaborative Tool

  1. Text-conferencing: This is the oldest and most basic form of online conferencing. One of the pioneering text-conferencing products was CoSy, conceived at the University of Guelph, Canada, in 1983. Today’s text-conferencing tools range from simple threaded formats to elaborate systems involving user and administrative support features. Given the choice, the DE students surveyed during the current project have invariably chosen simple text-conference formats requiring little learning effort (click here to read Report VIII).

  2. Audio-conferencing: ‘Internet telephone’ tools, which became available in the mid-90s, are usually restricted to one-on-one conversations. Many early audio products, limited to two participants, required each user to check the other’s Internet Protocol (IP) address in order to connect. Since a user’s IP address can change with each Internet connection, this method tended to be cumbersome (a brief sketch of this connection method follows this list). Today’s audio-conferencing methods provide access to numerous online participants at the click of a single icon, and usually provide parallel “text-box” facilities in support of the audio discussion.

  3. Video-conferencing: Most of today’s audio-conferencing tools also provide the option to make one’s Web-camera image available to other participants. This feature, however, can cause computers with less random access memory (RAM) to freeze up, and in most DE situations the video image is a novelty that soon loses its appeal. Numerous freeware messaging products now include a video-conferencing option with good audio-visual quality.

  4. Whiteboards: These tools provide a blank display on which conferencing participants can type, draw with a mouse or graphics tablet, visit websites together (co-browsing), and contribute simultaneously to the display’s modification. Standard whiteboard tools are available at no cost online, allowing remote users to collaborate on projects while conversing using an audio-visual-conferencing tool.

  5. Polling tools: Numerous software products and services allow users to create questionnaires, surveys, quizzes, and other types of polls, and to feed the results back to respondents either instantly or subsequently. These tools can give the DE teacher and students rapid, nonverbal analyses of a group’s thinking (a minimal example follows this list). Polls can be designed in advance or administered “on-the-fly,” though such polls need to be designed carefully in order to ensure that they yield valid and reliable conclusions.

  6. Course delivery systems: During the past five years, all of the above features have been combined into integrated software packages for the administration of online learning processes. (Click here to check Report V in the series, which identified 31 such products.) Their aggressive marketing and high cost have become major issues in the educational sector. This trend is similar to that observed in the selection of educational hardware in the 1970s, when separate gadgets (e.g., tape-recorders and slide-projectors) were combined into single devices. The relative clumsiness and high cost of these integrated hardware systems caused the market to return to more flexible “stand-alone” gadgets. During the 2000s, the online software market is moving in the same direction with the development of integrated ‘open source’ tools (Please click here to visit Reports XXIV and XXV).
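
As an illustration of the early IP-based connection method described in item 2 above, the following minimal Python sketch (not drawn from any of the products reviewed) shows how a participant might discover the local IP address that then had to be exchanged manually before each call:

    import socket

    def current_ip():
        # Open a UDP socket toward an arbitrary public address; no data are sent,
        # but the operating system records which local interface address would be used.
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.connect(("8.8.8.8", 80))
            return s.getsockname()[0]

    if __name__ == "__main__":
        # In the dial-up era this address typically changed with every session,
        # so it had to be re-checked and re-exchanged before each conversation.
        # (Behind a modern home router it reports only the local network address.)
        print("Ask your conference partner to connect to:", current_ip())

Having to repeat and exchange this step before every conversation is precisely what made the method cumbersome, and why today’s tools instead connect participants at the click of a single icon.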
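
Similarly, the instant-feedback idea behind the polling tools in item 5 can be illustrated with a short, hypothetical Python sketch (all names are invented for illustration, and no reviewed product is implied to work this way):

    class Poll:
        """A minimal in-memory poll: one question and a fixed set of options."""

        def __init__(self, question, options):
            self.question = question
            self.options = list(options)
            self.responses = {}  # participant name -> chosen option

        def respond(self, participant, choice):
            if choice not in self.options:
                raise ValueError("%r is not one of the offered options" % choice)
            self.responses[participant] = choice  # a later vote replaces an earlier one

        def results(self):
            # Feed results back "instantly" as percentages of the responses so far.
            total = len(self.responses) or 1
            return {option: 100.0 * sum(1 for c in self.responses.values() if c == option) / total
                    for option in self.options}

    poll = Poll("Is the audio quality acceptable?", ["Yes", "No", "Not sure"])
    poll.respond("student_1", "Yes")
    poll.respond("student_2", "Not sure")
    print(poll.results())  # {'Yes': 50.0, 'No': 0.0, 'Not sure': 50.0}

Even this toy version shows why “on-the-fly” polls need careful wording: the options fixed when the poll is created determine the only answers the tally can ever report.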

Best Practices in Online Conferencing

The evaluations reported in the series so far have had a direct impact on the practices of the graduate school that hosts the project: Athabasca University’s Masters of Distance Education (MDE) Program. In the late 1990s, MDE instructors used asynchronous, text-based methods of online collaboration alone. Since 2000, their courses have adopted a selection of course delivery systems (e.g., Elluminate, WebCT, and Wimba), and an increasing range of freeware tools (e.g., GroupBoard, PalTalk, Sonork, and Yahoo Messenger) which provide similar functions. Six of the 100 products evaluated between 2001 and 2003 have since failed. Otherwise the market has remained stable, while seeing an explosion of new and largely overlapping competitive products. One of the failed products, FireTalk, was arguably the most sophisticated audio-conferencing tool yet developed. Its demise indicates that even the most technically robust product can fail owing to market forces, and provides a warning to institutions that may be tempted to lock themselves into an investment in an expensive commercial product, rather than retaining the flexibility that accompanies the use of good-quality freeware.

In most cases, the MDE Program’s software selections are the direct result of the evaluation project’s recommendations. The project has provided similar assistance, and a greater awareness of the available collaborative tools, to distance educators and students worldwide. The selection of a good software package, however, only goes part way towards developing effective online practices. Expensive software and freeware alike can be rendered ineffective by inefficient usage; the importance of developing user skills and protocols therefore cannot be overstated. Numerous advisories have been published on the skills of effective teleconferencing (see the website’s “Sources” section), although at this stage most do not relate to the specific challenges of online conferencing. The online moderator in particular requires a complex set of “multi-tasking” skills, similar to those used in a TV control room, where a director must continually scrutinize the broadcast output while lining up the stimuli that will be used moments ahead. The software evaluation teams involved in the current project (MDE Program graduate students) develop these moderating skills in testing the software options, and formulate guidelines for their usage.

The following is a list of the recommended “best moderating practices” based on the research, trials, and tribulations of one of these evaluation teams. It relates specifically to the conferencing activities underlying most current online collaborative approaches.

1. Technical pre-meeting:

  1. As far as possible, obtain details of the hardware configurations, connection speeds, and operating systems of the conference participants, and ensure that these meet the requirements of the selected software (a short script for gathering such details is sketched after this list).

  2. Provide participants with a guide for software downloading, installation, and instructions on how to add one another to their user list.

  3. Encourage first-time participants to pre-test the software at least 24 hours before the meeting, including running the “audio set-up wizard” as appropriate.

  4. Urge participants to restart their computers 15 minutes before the conference, and not to open unnecessary applications (e.g., email) during the conference.

  5. Ask participants to log-on to the collaborative area at least five minutes before the conference session for a set-up check.
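
In support of item 1 above, the moderator can ask participants to run and return the output of a small script such as the following Python sketch (standard library only; connection speed still has to be checked separately, e.g., with an online speed test, and available RAM is not portably reported by the standard library):

    import platform

    # Gather basic system details a moderator needs before an online conference.
    details = {
        "operating system": platform.system() + " " + platform.release(),
        "machine type": platform.machine(),
        "processor": platform.processor() or "unknown",
    }

    for label, value in details.items():
        print(label + ": " + value)

The output can be pasted into an e-mail or the conference text-box, giving the moderator a quick check against the stated requirements of the selected software.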

2. Didactic pre-meeting:

  1. Ensure participants have a session agenda in advance, specifying the preparation required and the session’s expectations.

  2. Provide participants with a list of one another’s actual names, so that they can interpret on-screen ID names.

  3. Groups of 10 or more participants are not recommended for novice moderators, owing to the difficulty of keeping track of their text-box postings and speaking order. Unless instructed otherwise, a few students will dominate the discussion, while the majority will “lurk” (Tolley, 2000).

3. Technical meeting:

  1. Identify one participant as a technical assistant, whose assignment is to send personal text messages to users who are having trouble obtaining or maintaining their connection.

  2. Ask participants to restrict their use of “text-boxes” to central issues of conference coordination, questions, etc. Side chats can have the distracting effect of “whispering in class.”

  3. Ask participants not to send private messages to the moderator, as this will disrupt the main conference, and to direct technical comments to the technical assistant instead.

  4. If participants invite others to a side chat, or accept such invitations, they must be aware that they may lose the audio connection to the main conference.

  5. Suggest that participants only use a “hands free” audio option when actually speaking, owing to the feedback it can produce for other participants.

  6. Provide the facilitator with useful shorthand messages (as provided in some software packages).

4. Didactic meeting:

  1. Clarify the protocol for participation. If the audio software program does not feature a “raised hand” icon, explain the use of shorthand messages (if provided).

  2. Invite participants to state in the text-box if they lose audio.

  3. Check audio transmission and reception periodically throughout the session, as it may come and go without warning.

  4. Do not talk for extended periods without releasing the “talk” button; at times of busy Internet traffic, this relieves congestion and reduces signal break-up.

  5. Give the participants time for feedback. Use open-ended questions to encourage discussion, and direct the question to specific individuals if necessary.

  6. Post agendas, dates, article names, Web addresses, and other important information in the text-box, in case participants lose the audio connection or have difficulty with spellings, etc.

  7. Summarize discussion threads to clarify audio and text conversations.

  8. Save text-box transcripts for future reference.

5. Technical post-meeting:

  1. If it is impossible to warn participants in advance of the hardware and connection speeds required by the conferencing software, check with them subsequently about any technical problems they may have experienced. Such follow-up allows the teacher to identify the resource demands of specific collaborative tools, and to diagnose participants’ technical problems in using them.

6. Didactic post-meeting:

  1. To facilitate continued reflection and feedback about the learning materials and process, provide participants with a supportive bulletin board, or other forms of online communication.

Conclusion

The above user guidelines illustrate that technical as well as pedagogical skills are essential to the efficient use of online collaborative tools, as they are in the use of any educational medium. Johnston (as quoted by Tolley, 2000), states: “We need to take seriously the pedagogical issues arising out of teaching by online courses . . . But we are some way off mastering this new domain, and in the meantime we need to be mindful of the snags and pitfalls hereabouts. Let us be converted, but by deeds and not by faith alone.” The current software evaluation project will continue to uphold this maxim as a useful teaching and research activity, and to provide support for international distance educators and their students.

References

Tolley, S. (2000). How Electronic Conferencing Affects the Way We Teach. Open Learning, November. Milton Keynes, UK: Open University. Retrieved October 13, 2003 from: www.otis.scotcit.ac.uk/casestudy/tolley.doc

The next report in the series discusses the potential of polling tools in online collaboration.

N.B. Owing to the speed with which Web addresses are changed, the online references cited in this report may be outdated. They can be checked at the Athabasca University software evaluation site: cde.athabascau.ca/softeval/. Italicised product names in this report can be assumed to be registered trademarks.

JPB. Series Editor, Technical Notes