Volume 18, Number 6

Evaluation of Student Learning in Remotely Controlled Instrumental Analyses

Chris Meintzer¹, Frances Sutherland¹, and Dietmar K. Kennepohl²
¹Northern Alberta Institute of Technology, ²Athabasca University
The Canadian Remote Sciences Laboratories (CRSL) website (www.remotelab.ca) was successfully employed in a study of differences in students' performance and perceptions of their learning in the laboratory (in-person) versus at a remote location (remote access). The experiment was completed both in-person and via remote access by 70 students, who performed essentially the same, academically, in the two modes. One set of students encountered the in-person laboratory first and then did the remote laboratory, while the other set did the activities in the reverse order. The student perception survey results indicated that students found the two modes similar in overall value but felt they learned different things in each: theory and hands-on skills in person, and instrument software operation at a distance.
Keywords: distance learning, Internet, undergraduate laboratory instruction, atomic spectroscopy, remote laboratory
Remote access to modern chemistry laboratory equipment is important to both the chemical industry and chemistry education (Kennepohl, 2010). Most industrial analyses are now completed using software-controlled automated instruments, which affords the opportunity to operate chemistry instruments remotely, whether in hazardous environments or for on-line monitoring at remote locations of a chemical production facility. Chemistry instructors at universities and colleges are beginning to take advantage of this opportunity as well (Barot et al., 2005; Szalay, Zeller, & Hunter, 2005; van Rens, van Dijk, Mulder, & Nieuwland, 2013; Saxena & Satsangee, 2014). There are at least two reasons for this. First, since many chemical technologists will now be operating laboratory instruments in industry via a remote connection (either LAN or WAN), it is valuable to include remote-access activities in the curriculum (Baran, Currie, & Kennepohl, 2004). Second, remote access to laboratory instruments allows educational institutions to deliver distance education on these instruments, to share instruments, and to offer after-hours access to equipment with high scheduling demands (Albon, Cancilla, & Hubball, 2006). Initial studies on remote laboratories for teaching over the Internet focused primarily on the technology, its feasibility, and the learner's ability to connect with and control an experiment remotely. The current emphasis is more on pedagogy and learning design, to determine how to facilitate a high level of student learning and skills development at a distance (Azad, Auer, & Harward, 2011). To provide some insight into learning in a remote laboratory environment, we look to a related mode of laboratory delivery that has been more thoroughly studied: the computer simulation.
For many years, the use of computer simulations to create virtual laboratories has been extensively explored and is relatively well understood in the sciences (Smetana & Bell, 2012). In chemistry, recent reports have looked at differences in how students learn when using animations and simulations (Kelly & Jones, 2008; Akaygun & Jones, 2013). These tools aid student understanding of the behaviour of chemicals without the cost and laboratory time required for in-the-laboratory exploration. The literature discussion has moved away from arguments over which mode is better (in-person versus simulation) towards the realization that an appropriate combination of the two often provides richer learning opportunities than either mode on its own. For many years, the natural assumption for the order of this combination had been to treat the simulation as a pre-lab exercise followed by the real laboratory experience. However, the increased benefit has been seen irrespective of the order of the modes (Jaakkola & Nurmi, 2008; Zacharia, Olympiou, & Papaevripidou, 2008), and one group has even reported that the best combination in their teaching laboratory was to do the real experiment first, before the simulation (Smith & Puntambekar, 2010). By analogy, would blending remote laboratory work with onsite laboratory work afford similar benefits, and would the order in which a student is exposed to onsite versus remote laboratories matter from a learning perspective?
It is important to note that in contrast to computer simulations, remote control of laboratory instruments allows students to analyze real samples, on real instruments, with resulting real behaviour. Chemical educators are interested in the effects that remote-access instruction in instrumental analysis has on student learning and students' attitudes to learning. However, research in both remote laboratory design (Cagiltay, Aydin, Aydin, Kara, & Alexandru, 2011; Lindsay, Murray, & Stumpers, 2011) and evaluation of student learning in different laboratory modes (Ma & Nickerson, 2006; Nickerson, Corter, Esche, & Chassapis, 2007; Elawady & Tolba, 2009) is meagre in chemistry; it is being led by other disciplines such as computing science and engineering. Indeed, a world-wide remote controlled laboratory inventory indicates that about 60% of sites are from robotics, computing, and engineering fields, 30% from physics, and 10% from all other science disciplines (Gröber, Vetter, Eckert, & Jodl, 2007).
The framework of teaching and learning in chemistry is steeped in constructivist theory (Bailey & Garratt, 2002). This is especially true of the laboratory component, which attempts, among other things, to bring together and mutually strengthen theory and practice. Three important principles are assumed in this construct, namely: (1) there is a direction to learning, which continually builds on previous knowledge; (2) knowing and doing are intimately linked, so the laboratory activity provides situated cognition; and (3) student perception can influence the construction of knowledge. In this paper, we report an investigation of the differences in learner performance and perceptions in the laboratory (in-person) compared with a remote location (remote access) in equivalent analytical chemistry experiments. We also examine whether the mode of the experiment the student encounters first (in-person or remote access) plays a role in learning or learner attitude in the laboratory. Student performance was tracked, and students were surveyed as to their perceptions of ease of operation, quality of support materials, how much they felt they learned in both scenarios, and how much they felt they learned from their laboratory partner in both scenarios.
This study applied for and received approval from both the NAIT and Athabasca University Research Ethics Boards. It was designed to develop a laboratory experiment that could be performed both in the laboratory and by remote control, and to investigate student performance and perceptions as to how they learn differently under the two conditions.
Students (n = 70), with the exception of two individuals, worked in self-selected pairs. Some student pairs completed the in-laboratory exercise first and then the remote-access exercise; the other pairs completed the exercises in the reverse order. The order of the remote-access versus in-laboratory work was assigned randomly. Pair performance (based on laboratory report grades) was tracked, and any operational issues with using the remote environment were noted by the instructors. In addition, once both the remote and in-person experiments had been completed, all students were asked to voluntarily complete two surveys that measured their perceptions and comparisons of the two experiences. No student personal information was collected in this study, and the surveys were anonymously coded by the students such that the researchers could link the surveys to the same participant but not identify the participant. The survey tools asked the students to identify whether they had completed the in-laboratory exercise or the remote-access exercise first. Student perceptions were measured through the use of Likert scale questions and comment boxes. The survey tools are provided in the Supporting Materials.
The data collected from individuals remained anonymous and could not be linked back to any participant.
The NAIT Chemical Technology program has a website that allows remote access to a variety of chromatographic and spectroscopic instruments, including gas chromatographs (FID, MS, PFPD, and TCD detectors), a liquid chromatograph (triple quadrupole MS detector), as well as NMR, ICP-OES, and ICP-MS spectrometers. All of the accessible instruments are equipped with autosamplers. Access to an instrument is controlled by a website administrator, who grants appointments and log-in credentials to qualified operators. The website also includes facilities that allow users to request bookings, and to view and manage personal bookings. To control an instrument through the website, the client accesses the site (www.remotelab.ca) through their browser. The request is sent to an F5 switch and redirected to a RemoteLab web server (Figure 1). The RemoteLab server uses Active Directory to authenticate access to the website. If the client's username and password are valid, they are allowed to proceed. If the client has a valid reservation on an instrument, the RemoteLab website sends the request to a VMWare server. The first time a client accesses an instrument, if the VMWare software is not installed on their device, the VMWare server initiates installation of the software on their computer. The VMWare server manages the client's access to the laboratory computer from this point on (Figure 1).
Figure 1. Remote analytical instrument access.
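The flow above amounts to a chain of checks: authenticate against Active Directory, verify the reservation, then hand the session to the VMWare broker. The following is a minimal sketch of that chain in Python; every name, data structure, and message in it is an illustrative assumption of ours, as the actual RemoteLab implementation is not published.

```python
from datetime import datetime

# Hypothetical stand-ins for the Active Directory accounts and the
# reservation table kept by the site administrator.
VALID_ACCOUNTS = {("pair_01", "s3cret")}
RESERVATIONS = [
    ("pair_01", "ICP-OES",
     datetime(2017, 3, 1, 13, 0), datetime(2017, 3, 1, 15, 0)),
]

def authenticate(username: str, password: str) -> bool:
    # In production this check would be an Active Directory query.
    return (username, password) in VALID_ACCOUNTS

def has_reservation(username: str, instrument: str, now: datetime) -> bool:
    return any(u == username and i == instrument and start <= now < end
               for u, i, start, end in RESERVATIONS)

def handle_request(username: str, password: str, instrument: str) -> str:
    """Decision sequence after the F5 switch redirects the client
    to a RemoteLab web server (Figure 1)."""
    if not authenticate(username, password):
        return "denied: invalid credentials"
    if not has_reservation(username, instrument, datetime.now()):
        return "denied: no current reservation"
    # Hand off to the VMWare server, which installs its client software
    # on first use and then brokers the virtual-machine session with the
    # computer attached to the instrument.
    return "handed off to VMWare session broker"

print(handle_request("pair_01", "s3cret", "ICP-OES"))
```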
The Inductively Coupled Plasma - Optical Emission Spectrometer (ICP-OES) was chosen as the instrument for this study. Experiments and learning support materials developed for the students included the laboratory experiment procedures, as well as two tutorials (with animations) and study games. The laboratory experiment procedures included introductory theory concepts as well as the instructions for completing the laboratory exercises.
The in-person experiment involved studying the interference effects of iron on tin analyses. In this experiment the students prepared iron (III) nitrate (Fe(NO3)3•9H2O) and tin (IV) chloride (SnCl4•5H2O) solutions in an acidic matrix. The students also prepared analytical solutions containing tin alone, as well as solutions with various concentrations of tin and iron. The analytical solutions were designed to illustrate the instrumental results when only tin was present versus solutions where varying amounts of iron were present as an interferent. In the second part of this experiment the students utilized a software algorithm that allows iron to be listed as an interferent. In this section the instrument software monitors the emission of both tin and iron individually and calculates a correction factor to remove the effects of the iron interference. The students repeated the analysis of the samples in this manner and observed the new values for tin concentration (at either tin wavelength) produced by the correction factor applied by the software. The students also used the instrument data to calculate their own correction factor and compare it to the one reported by the software.
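The paper does not give the exact formula the instrument software uses, but a conventional interelement correction (IEC) of this kind can be sketched as follows; the functions and all numbers are purely illustrative and are not data from the study.

```python
def correction_factor(apparent_sn: float, fe_conc: float) -> float:
    """IEC factor k: apparent Sn concentration produced by a pure Fe
    standard, per unit of Fe concentration."""
    return apparent_sn / fe_conc

def corrected_sn(measured_sn: float, measured_fe: float, k: float) -> float:
    """Subtract the Fe contribution from the apparent Sn reading."""
    return measured_sn - k * measured_fe

# Illustrative numbers only: a 100 ppm Fe standard that reads as
# 0.8 ppm apparent Sn gives k = 0.008; a sample reading 5.4 ppm Sn
# alongside 50 ppm Fe then corrects to 5.0 ppm Sn.
k = correction_factor(apparent_sn=0.8, fe_conc=100.0)
print(corrected_sn(measured_sn=5.4, measured_fe=50.0, k=k))
```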
In the remote experiment, essentially the same procedure was followed. The same solutions were prepared, and the same instrumental analyses were performed. The main difference was that once the solutions had been prepared and loaded into the instrument autosampler trays, the students went to a remote location and ran the instrument software through the Canadian Remote Sciences Laboratories website. Once the operation of the instrument was complete, the students returned to the laboratory to clean up their samples, place the instrument into a standby state, and collect their instrument printouts. The students had the option of using a Chemical Technology program notebook computer or one of their own.
The information acquired from the study can be grouped into three categories: (1) instrument/website operation, (2) student performance, and (3) student perceptions.
A few issues arose during the remote experiment preparation and student participation phases. First, the booking system of the CRSL website only allows one user ID to be logged into the computer operating the laboratory instrument at a time, and it has no capability for automated scheduling. This means that, for completely secure instrument appointments, manual reservations with separate IDs would need to be made centrally by a site administrator. For the purposes of this initial study, we chose to use a single generic login ID for all student pairs to streamline the reservation process. Second, due to security concerns at NAIT, remote access to webcams from outside the NAIT system was not allowed. Two separate incidents occurred (a malfunction of the autosampler and improper connections to the peristaltic pump) that would have been more readily apparent to the students at their remote location had they been able to observe the instrument operation via a webcam. Third, the user's computer must install and run VMWare software to allow the website to access the computer directly connected to the laboratory instrument as a virtual machine. This is an additional step for the student. However, with the exception of printing files remotely using a Mac notebook, the software appears to work well with a variety of devices. Fourth, for the ICP-OES instrument the remote control connection provided excellent instrument stability: regardless of a remote computer crash or closure or a dropped wireless connection, once the network connection was re-established the instrument was found to be continuing its last task.
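For illustration, the automated scheduling the booking system lacks would essentially have to enforce non-overlapping, per-pair reservations on each instrument. The hypothetical sketch below is ours and does not reflect the actual CRSL code.

```python
from datetime import datetime, timedelta

class InstrumentSchedule:
    """Toy scheduler enforcing one user at a time per instrument."""

    def __init__(self) -> None:
        self.bookings: list[tuple[datetime, datetime, str]] = []

    def book(self, user_id: str, start: datetime, hours: float) -> bool:
        """Grant the slot only if it overlaps no existing booking."""
        end = start + timedelta(hours=hours)
        for s, e, _ in self.bookings:
            if start < e and s < end:  # intervals overlap
                return False
        self.bookings.append((start, end, user_id))
        return True

icp_oes = InstrumentSchedule()
slot = datetime(2017, 3, 1, 13, 0)
print(icp_oes.book("pair_07", slot, 2))  # True: slot is free
print(icp_oes.book("pair_12", slot, 1))  # False: conflicts with pair_07
```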
Tables 1 and 2 present the academic performance (laboratory report grades) of student pairs that completed the in-person experiment first versus those who completed the remote access experiment first.
Table 1
Laboratory Report Grades for Student Pairs That Completed the In-Person Activity First
Student group | In-person grade (%) | Remote grade (%) | Improvement on remote lab (%)^a |
1 | 91.5 | 89.5 | −2.0 |
2 | 66.5 | 80.5 | 14.0 |
3 | 80.5 | 65.0 | −15.5 |
4 | 69.5 | 67.0 | −2.5 |
5 | 54.0 | 68.5 | 14.5 |
6 | 92.0 | 81.5 | −10.5 |
7 | 71.0 | 71.0 | 0.0 |
8 | 77.5 | 88.0 | 10.5 |
9 | 83.5 | 84.5 | 1.0 |
10 | 44.5 | 72.5 | 28.0 |
11 | 94.5 | 97.5 | 3.0 |
12 | 88.5 | 89.0 | 0.5 |
13 | 77.5 | 94.5 | 17.0 |
14 | 82.0 | 88.0 | 6.0 |
15 | 80.5 | 69.0 | −11.5 |
16 | 80.5 | 79.0 | −1.5 |
17 | 80.5 | 90.0 | 9.5 |
18 | 78.0 | 77.5 | −0.5 |
19 | 80.5 | 70.5 | −10.0 |
20 | 69.5 | 69.0 | −0.5 |
21 | 87.5 | 78.0 | −9.5 |
22 | 83.0 | 76.5 | −6.5 |
23 | 90.5 | 93.0 | 2.5 |
Average | 78.4 | 80.0 | 1.6 |
a. The difference in report grades (%): remote-access grade minus in-person grade. A positive result means the students scored better in their second (remote access) experience. A negative result means the students scored better in their first (in-person) experience.
Table 2
Laboratory Report Grades for Student Pairs That Completed the Remote Access Activity First
Student group | Remote grade (%) | In-person grade (%) | Improvement on in-person lab (%)^a |
1 | 83.0 | 80.5 | −2.5 |
2 | 87.0 | 86.0 | −1.0 |
3b | 72.0 | 72.0 | 0.0 |
4 | 83.0 | 83.0 | 0.0 |
5 | 82.5 | 75.0 | −7.5 |
6 | 84.5 | 83.5 | −1.0 |
7 | 78.5 | 75.0 | −3.5 |
8 | 63.5 | 72.5 | 9.0 |
9 | 69.0 | 72.0 | 3.0 |
10 | 81.0 | 81.0 | 0.0 |
11 | 85.5 | 79.0 | −6.5 |
12 | 88.0 | 78.0 | −10.0 |
13b | 60.0 | 67.0 | 7.0 |
Average | 78.3 | 77.3 | −1.0 |
a. The difference in report grades (%): in-person grade minus remote-access grade. A positive result means the students scored better in their second (in-person) experience. A negative result means the students scored better in their first (remote access) experience.
b. Only one student in this group.
A total of 70 students were eligible to participate in this study. Of this number, 46 students completed consent forms and both surveys. Within the surveys, responses to specific questions ranged from 42 to 46. When completing the experiment by remote access, 94% of the students completed the activity somewhere on the NAIT campus. The majority of students were satisfied with the instructional materials as presented, with only 11% (in-lab) and 33% (remote) requesting minor revisions. Table 3 summarizes the Likert scale responses to nine of the survey questions.
Table 3
Percent Student Agreement to Survey Questions^a
Survey question | In-laboratory^b | Remote access^b |
Would recommend the experiment | 95.5 | 91.1 |
Appropriate difficulty | 91.3 | 86.9 |
Instructions easy to understand | 93.5 | 95.6 |
Preferred mode of experiment | 38.7 | 43.5 |
Helped to understand theory and concepts | 93.0 | 64.5 |
Developed practical hands-on skills | 97.6 | 84.8 |
Felt more learned in current mode of experiment | 52.2 | 17.4 |
Learned more from laboratory partner | 45.7 | 44.4 |
Easier instrument operation | 34.8 | 40.0 |
a. Represents two separate surveys (in-laboratory and remote).
b. Sum of agree and strongly agree responses.
In addition, student comments prompted by open-ended questions in the survey were collected (see Appendix A). These general comments largely echoed the responses to the earlier specific questions, while giving more anecdotal detail; in other cases, they touched on areas not covered by the set questions. Table 4 summarizes (by comment type) some of the more common remarks. A majority of the students liked being able to see and hear the instrument at work in the laboratory. They liked, or were impressed by, their ability to control the instrument from a remote location, and they felt that they learned more from their partner when they could not simply ask the instructor for clarification. Half of the students felt that they understood the theory of the experiment better when an instructor was present and they were getting hands-on experience. A small portion of the students felt that the remote-access experiment was more stressful; a similar number mentioned that not being able to see the instrument working, or having to trust that it was working, was stressful.
Table 4
Selected Summary of General Comments
General comment type | Respondents (%) |
Liked the ability to see the instrument at work in-laboratory | 66 |
Seeing the instrument in the laboratory helped them learn more | 30 |
Did not like inability to see the instrument in the remote access experiment | 29 |
Wanted a webcam added to allow observation for remote operation | 22 |
Disliked the crowded, noisy laboratory and having to wait to use the instrument | 30 |
Liked being able to control the instrument from somewhere else | 65 |
Found remote access experiment was more stressful | 24 |
Understood the theory of the laboratory better with an instructor and hands-on experience | 42 |
Scheduling and access to the ICP-OES instrument were managed manually and with generic IDs. With approximately 30-35 students each term this is manageable, but a more automated and secure process would be preferable for larger classes. The lack of a webcam on the website limited the student experience. Several of the open-ended comments from the student survey revolved around the desire to see the instrument (see Table 4). A webcam to monitor the autosampler would go a long way toward relieving some of the anxiety associated with operating the instrument remotely. This further underscores the importance of visual feedback for the learner in a remote laboratory environment reported earlier (Baran et al., 2004). Printing from the CRSL website was performed through the laboratory network printer. This was mostly successful, but failed in a few instances; the causes of these failures have not been identified. Printing to a student home printer was not investigated in this study. The CRSL website was found to give stable control of the ICP-OES instrument even when the network providing the link to the website was not stable. Any time the network lost its connection to the website, the instrument continued performing any tasks it had been assigned, or continued in its standby state, until the connection was re-established.
The ICP-OES instrument was effectively operated remotely with essentially no more physical operational issues than are normally encountered when students operate the instrument in the laboratory under instructor supervision. All 70 students successfully completed the sample analysis in the remote access experiment via the CRSL website. The experiment was completed using multiple PCs and laptops, including at least one Mac laptop (running Windows). Most of the student pairs obtained essentially the same grade on the experiment in both scenarios (in-person and remote access). A few pairs performed slightly better on the in-person report, and a similar number performed better on the remote access report (Tables 1 and 2). However, as a group, the average grades for the remote and in-person modes were not statistically different (t-test). This was observed for the pairs that started with the in-person mode (Table 1), as well as for those that started with the remote mode (Table 2). Overall, it would appear that performance on the laboratory report depended more on other factors in the student experience.
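The group-level comparison can be illustrated with the Table 1 grades. The paper reports only "t-test"; the paired variant below is our assumption, since each student pair produced a grade in both modes.

```python
from scipy import stats

# Grades transcribed from Table 1 (in-person mode first, then remote).
in_person = [91.5, 66.5, 80.5, 69.5, 54.0, 92.0, 71.0, 77.5, 83.5, 44.5,
             94.5, 88.5, 77.5, 82.0, 80.5, 80.5, 80.5, 78.0, 80.5, 69.5,
             87.5, 83.0, 90.5]
remote = [89.5, 80.5, 65.0, 67.0, 68.5, 81.5, 71.0, 88.0, 84.5, 72.5,
          97.5, 89.0, 94.5, 88.0, 69.0, 79.0, 90.0, 77.5, 70.5, 69.0,
          78.0, 76.5, 93.0]

t_stat, p_value = stats.ttest_rel(remote, in_person)
# The p-value comes out well above 0.05, consistent with the authors'
# finding of no statistically significant difference between the modes.
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```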
In addition, the student survey indicates that many aspects of each mode are perceived to be the same. The majority of students felt that the in-person and remote access activities were equally appropriate in their difficulty and equally easy to understand, and they would equally recommend the experiments to other students. The majority of students felt that both experiments helped them with their practical skills. Just under half (46% in-laboratory and 44% remote-laboratory) of the students felt they learned more from their laboratory partners in each of the two scenarios. The majority of the students were satisfied with the laboratory instructional materials for the experiments and felt at most only minor modifications were required (11% in-laboratory and 33% remote). Although it might appear that more students preferred the remote access experiment (44% vs 39%) and that more students found the instrument easier to operate in the remote access experiment (40% vs 35%), these results were not found to be statistically different (t-test). So, the order in which the students performed the experiment did not affect their performance or their preference for one experiment mode over the other.
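Readers can run a similar check on the survey proportions. The paper reports a t-test; the sketch below uses a two-proportion z-test instead, a comparable choice for percentage agreement. Since per-question sample sizes varied from 42 to 46, the sample sizes assumed here are approximations.

```python
from math import sqrt
from scipy.stats import norm

n1 = n2 = 46              # assumed respondents per survey (actual: 42-46)
p1, p2 = 0.435, 0.387     # preferred remote vs. preferred in-person

# Pooled two-proportion z-test.
p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
z = (p1 - p2) / sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
p_value = 2 * norm.sf(abs(z))
print(f"z = {z:.2f}, p = {p_value:.2f}")  # p well above 0.05: not significant
```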
Our initial expectation was that, analogous to the literature on computer simulations, different components of learning would be better suited to different modes of delivery. On the whole, the evidence indicates that the laboratory experience for this experiment is equivalent irrespective of employing in-person or remote access to the instrument. This is not surprising in hindsight, as remote access would be expected to be more similar to handling the real instrument than a computer simulation is. However, there were a couple of clear differences in student attitude. Statistically, more of the students felt that the in-person experiment helped them to understand the theory of ICP-OES better than the remote access experiment (93% vs 65%), and more students felt that they learned more in the in-person experiment than in the remote access experiment (52% vs 17%). While performance and meeting learning outcomes did not differ between modes, these survey results underscore the high value placed on the role of human interaction. Direct in-person student-teacher interaction is reduced in the remote mode, and the student perception is that learning was also reduced. In contrast, because students worked in pairs (in both modes), a substantive amount of learning was attributed to the partner through student-student interaction. This is significant, since learning from a peer has been recognized as a major contributor to successful learning (Johnson & Johnson, 1996; Hooper & Hannafin, 1991).
Furthermore, some of the general comments indicate that the remote access mode encouraged better understanding of the instrument software and provided some appreciated flexibility and autonomy for the learner. The influence of the mode of laboratory delivery on a student's perception of what they have achieved, compared with what they were meant to achieve, has been previously described by Lindsay and Good (2005). A student's take on a particular learning environment has the potential to shape the learning. The authors go on to report an increased acceptance of alternate modes of laboratory experiments by students who had experienced those modes, but with a continued "substantial bias" towards in-person labs (Lindsay & Good, 2005). Even though there were some stated perceived pros and cons, there was no overall preferred mode in our study.
It would appear to be safe to conclude that the students found the experiments to be similar in value, but that what they specifically learned in each of the experiments was not the same. They felt that they learned more about the theory of the experiment and operation of the instrument when they were in the laboratory with an instructor present. They felt that they learned more from their partner and about the software running the instrument when they were on their own at a remote location.
The design and operation of both experimental modes were mostly equivalent, with the exception of a couple of technology issues (lack of automated scheduling and inability to print files remotely using a Mac notebook) and one policy issue (no access to a live webcam of the instrument). However, for the ICP-OES instrument, the remote control connection provided excellent instrument stability, which is essential for running the experiment successfully. Students learn equal amounts in both in-person and remote access approaches to instruction in the laboratory, but they learn some things differently. The in-person activities rely more on direct interactions with the instructor and leave students with the perception of having learned more hands-on skills and a better understanding of the theory of the experiment. The remote access activities leave them with the perception that they learned more about the operation of the instrument software. Since all students were eventually exposed to both modes of this experiment, they had the opportunity to make use of both approaches. The order in which students did the in-person or remote laboratories made no significant difference in performance in this course.
Akaygun, S., & Jones, L. L. (2013). Animation or simulation: Investigating the importance of interactivity for learning solubility equilibria. In J. P. Suits & M. J. Sanger (Eds.), Pedagogic roles of animations and simulations in chemistry courses (pp. 127-159). Washington: American Chemical Society.
Albon, S., Cancilla, D., & Hubball, H. (2006). Using remote access to scientific instrumentation to create authentic learning activities in pharmaceutical analysis. American Journal of Pharmaceutical Education, 70(5), 121.
Azad, A. K. M., Auer, M. E., & Harward, V. J. (Ed.). (2011). Internet accessible remote laboratories: Scalable e-learning tools for engineering and science disciplines. Hershey, PA: IGI Global.
Bailey, P. D., & Garratt, J. (2002). Chemical education: Theory and practice. University Chemistry Education, 6(2), 39-57.
Baran, J., Currie, R., & Kennepohl, D. (2004). Remote instrumentation for the teaching laboratory. Journal of Chemical Education, 81(12), 1814.
Barot, B., Kosinski, J., Sinton, M., Alonso, D., Mutch, G. W., Wong, W., & Warren, S. A. (2005). Networked NMR spectrometer: Configuring a shared instrument. Journal of Chemical Education, 82(9), 1342.
Cagiltay, N. E., Aydin, E., Aydin, C. C., Kara, A., & Alexandru, M. (2011). Seven principles of instructional content design for a remote laboratory: A case study on ERRL. IEEE Transactions on Education, 54(2), 320-327.
Elawady, Y. H., & Tolba, A. S. (2009). Educational objectives of different laboratory types: A comparative study. International Journal of Computer Science and Information Security, 6(2), 89-96.
Gröber, S., Vetter, M., Eckert, B., & Jodl, H.-J. (2007). Experimenting from a distance—Remotely controlled laboratory (RCL). European Journal of Physics, 28, S127-S141.
Hooper, S., & Hannafin, M. J. (1991). The effects of group composition on achievement, interaction, and learning efficiency during computer-based cooperative instruction. Educational Technology Research and Development, 39(3), 27-40.
Jaakkola, T., & Nurmi, S. (2008). Fostering elementary school students' understanding of simple electricity by combining simulation and laboratory activities. Journal of Computer Assisted Learning, 24, 271-283.
Johnson, D. W., & Johnson, R. T. (1996). Cooperation and the use of technology. In D. H. Jonassen, (Ed.), Handbook of research for educational communications and technology (pp. 170-198). New York: Simon & Schuster Macmillan.
Kelly, R. M., & Jones, L. L. (2008). Investigating students' ability to transfer ideas learned from molecular animations of the dissolution process. Journal of Chemical Education, 85(2), 303-309.
Kennepohl, D. (2010). Remote control teaching laboratories and practicals. In D. Kennepohl & L. Shaw (Eds.), Accessible elements: Teaching science online and at a distance. Edmonton: Athabasca University Press.
Lindsay, E. D., & Good, M. C. (2005). Effects of laboratory access modes upon learning outcomes. IEEE Transactions on Education, 48(4), 619-631.
Lindsay, E., Murray, S., & Stumpers, B. D. (2011). A toolkit for remote laboratory design & development. Paper presented at the Frontiers in Education Conference (FIE), Rapid City, SD, October 12-15.
Ma, J., & Nickerson, J. V. (2006). Hands-on, simulated, and remote laboratories: A comparative literature review. ACM Computing Surveys, 38(3).
Nickerson, J. V., Corter, J. E., Esche, S. K., & Chassapis, C. (2007). A model for evaluating the effectiveness of remote engineering laboratories and simulations in education. Computers & Education, 49, 708-725.
Saxena, S., & Satsangee, S. P. (2014). Offering remotely triggered, real-time experiments in electrochemistry for distance learners. Journal of Chemical Education, 91(3), 368-373.
Smetana, L. K., & Bell, R. L. (2012). Computer simulations to support science instruction and learning: A critical review of the literature. International Journal of Science Education, 34(9), 1337-1370.
Smith, G. W., & Puntambekar, S. (2010). Examining the combination of physical and virtual experiments in an inquiry science classroom. Paper presented at the Computer Based Learning in Science (CBLIS) Conference, Warsaw, Poland. Retrieved from http://lekythos.library.ucy.ac.cy/bitstream/handle/10797/14520/B3_Smith_PHYSICAL%20AND%20VIRTUAL%20EXPERIMENTS%20INQUIRY%20SCIENCE_CBLIS_2010.pdf?sequence=1&isAllowed=y
Szalay, P. S., Zeller, M., & Hunter, A. D. (2005). The incorporation of single-crystal X-ray diffraction into the undergraduate chemistry curriculum using Internet-facilitated remote diffractometer control. Journal of Chemical Education, 82(10), 1555-1557.
van Rens, L., van Dijk, H., Mulder, J., & Nieuwland, P. (2013). Using a web application to conduct and investigate syntheses of methyl orange remotely. Journal of Chemical Education, 90(5), 574-577.
Zacharia, Z. C., Olympiou, G., & Papaevripidou, M. (2008). Effects of experimenting with physical and virtual manipulatives on students' conceptual understanding in heat and temperature. Journal of Research in Science Teaching, 45, 1021-1035.
The following is the script for the two online survey tools used to measure student perceptions of their (1) in-person and (2) remote access laboratory experiences.
We are requesting your assistance in providing us with your views on the remote and/or in-lab experiments you have carried out in [these courses]. The results of the survey are being used both as a scholarly assessment of the effectiveness of these laboratory experiences and as feedback to further develop and improve the experiments. We ask that you take a moment now to do the survey. It should take about ten to fifteen minutes to complete. Participation is completely voluntary, anonymous, and confidential. You are free to discontinue participation at any time during the study. No one except the researchers and their supervisors will be allowed to see the answers to the questionnaires. There are no names on the questionnaires. Only group information will be summarized for any presentation or publication of results. The anonymous data will be stored on a password-protected computer at XXXX for three years, at which time it will be permanently erased. The results will be used to further develop this course and will benefit future students. Thank you in advance for your assistance.
Note: You should have completed both the [course] in-person experiment and the [course] remote access experiment before completing this survey.
Reminder: This is the first three letters of your mother's name followed by the first three letters of your month of birth (no spaces). For example, mother's name is Jane and month of birth is December - anonymous research code is: JanDec
I would recommend the in-person lab experiment to others.
Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree |
The material covered and the degree of difficulty is appropriate for the level of the course.
Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree |
The instructions were clear and easy to understand.
Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree |
Performing the in-person lab experiment has helped me understand the theory and concepts underlying the experiment.
Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree |
Performing the in-person lab experiment has helped me to develop practical hands-on skills.
Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree |
When performing the experiment, I preferred the in-person lab experiment to the remote access lab experiment.
Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree |
What did you like most about the in-person lab experiment?
Be specific.
What did you like least about the in-person lab experiment?
Be specific.
I learned more performing the experiment via in-person access rather than remote access.
Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree |
If you feel that you learned more in one mode of access to the experiment than the other, please describe here.
Be specific.
Compared to in-person, the remote operation of the instrument was:
More Difficult | Difficult | Similar in Difficulty | Somewhat Easier | Easy |
How would you rate your level of satisfaction with the instructional materials provided in the in-person lab experiment?
Reading materials, videos, graphics, PowerPoint presentations
Did not use them | Very Dissatisfied | Dissatisfied | Neutral | Satisfied | Very Satisfied |
Based on your experiences with the materials, are there any changes you would like to see made to the in-person lab experiment to improve its quality for other students?
Major Revisions | Minor Revisions | No changes |
If you feel changes are necessary, please specify what changes you think should be made.
[The remote access survey opened with the same preamble, completion note, and anonymous research code reminder as the in-person survey above.]
Where did you access the remote lab website?
Home | Friends | NAIT | Other - please specify ______________________ |
I would recommend the remote access lab experiment to others.
Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree |
The material covered and the degree of difficulty is appropriate for the level of the course.
Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree |
The instructions were clear and easy to understand.
Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree |
Performing the remote access lab experiment has helped me understand the theory and concepts underlying the experiment.
Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree |
Performing the remote access lab experiment has helped me to develop practical hands-on skills.
Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree |
When performing the experiment, I preferred the remote access lab experiment to the in-person lab experiment.
Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree |
What did you like most about the remote access lab experiment?
Be specific.
What did you like least about the remote access lab experiment?
Be specific.
I learned more performing the experiment via remote access rather than in-person.
Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree |
If you feel that you learned more in one mode of access to the experiment than the other, please describe here.
Be specific.
Compared to in-person, the remote operation of the instrument was:
More Difficult | Difficult | Similar in Difficulty | Somewhat Easier | Easy |
How would you rate your level of satisfaction with the instructional materials provided in the remote access lab experiment?
Reading materials, videos, graphics, PowerPoint presentations
Did not use them | Very Dissatisfied | Dissatisfied | Neutral | Satisfied | Very Satisfied |
Based on your experiences with the materials, are there any changes you would like to see made to the remote access lab experiment to improve its quality for other students?
Major Revisions | Minor Revisions | No changes |
If you feel changes are necessary, please specify what changes you think should be made.
Evaluation of Student Learning in Remotely Controlled Instrumental Analyses by Chris Meintzer, Frances Sutherland, and Dietmar K. Kennepohl is licensed under a Creative Commons Attribution 4.0 International License.