Online Blogucation
4 Dec 2012

Do Web 2.0 tools really work?

In my work, I help higher education instructors hone their online teaching capabilities. I have not had a class go by without someone posing the question: do Web 2.0 tools really work? Their hesitation is warranted: they are bombarded daily with proclamations of Web 2.0 as the education magic bullet. Is it? Or is it just another education fad that will fade away as fast as it came? There is actually a dearth of rigorous research supporting such emphatic rhetoric. But I thought there might be some, and I set out to find it.

Before I tell you about the results of my literature search, let me tell you how I embarked on it. The variety of Web 2.0 tools and the number of educational variables are enormous, so expecting to cover it all in a single blog post would be preposterous. I had to be selective. I wanted true comparisons; that is, studies that controlled variables and in which at least two groups received comparable instruction. In other words, I excluded studies where one group used Web 2.0 tools while the other received no supplementary instruction, as well as studies with no pre-treatment measurement. Further, I selected quantitative studies because I was interested in measures of student achievement; studies that focused on students' engagement and perception, along with qualitative studies, were also excluded. Moreover, studies in which the timing of the treatment (but not the treatment itself) varied were excluded as well. Finally, I focused on higher education and typical academic core coursework published during the last three years. Let me tell you, though, that these constraints almost put me on the verge of ending this blog post right here for lack of studies, so I refrained from any further limitations.

The first study I found was Malhiwsky's (2010) Ph.D. dissertation. The study compared the achievement of community college students in the US who were taking ten-week-long online Spanish classes. All students had to complete assignments involving authentic language-learning practices, but one group did so by producing podcasts and videos while the other group did not use Web 2.0 tools. The two groups showed no significant difference in pre-test scores, but post-test scores for the Web 2.0 group were significantly higher than those of the non-Web 2.0 group. The author also administered the Classroom Community Survey, which revealed a higher level of classroom community in the Web 2.0 courses. Interestingly, though, both groups reported the same level of self-reported learning; that is, both groups had a similar perception of how much they had learned.

Another study, conducted by O'Bannon et al. (2011, and personal communication: email 11/28/2012), compared pre-service teachers' achievement when visually enhanced podcasts were used instead of lectures in a semester-long mandatory technology course at a US university. All students used a learning management system to submit assignments, participate in discussions, take quizzes, and access course resources. They also read textbook chapters, watched the instructor's demonstrations, participated in hands-on practice, and developed a project. Half of the students attended lectures, taking notes on the slideshow handouts, while the other half received the podcast treatment. Again, pre-test scores were not significantly different, but post-test scores were significantly higher for the podcast group than for the lecture group. Students tended to like the podcasts and felt they were reasonably effective, but they disagreed that podcasts should replace lectures.

Su et al. (2010) conducted a quasi-experiment during an introductory computer class in Taiwan. The control class used a Web-based discussion board that did not include the article under discussion. The experimental class used a Web 2.0 collaborative annotation system that included the article and let students add markups to it; the tool also offered annotation searching and could host discussion forums. Pre-test scores (yes, you guessed it) showed no statistically significant differences between the classes. During the fifteen-week-long course, students took three low-stakes tests plus a midterm and a final. The first test showed no statistically significant differences between the classes, which the authors attributed to the experimental group's lack of experience with the Web 2.0 tool, but on the other two low-stakes tests the experimental group scored significantly higher than the control group. On the midterm and final exams the two groups did not differ significantly, which the authors argued was due to more "catching up" done by the students in the control group.

In my quest, I did find a few other studies (with emphasis on "a few"). Along the way, I read a great many studies and concluded that, while I still do not believe in magic bullets, Web 2.0 tools do a great deal when properly implemented. And this is not said lightly, because they do present their challenges. Still, I have yet to find an article that describes a negative effect, and in the overwhelming majority of cases the effect was positive. Sometimes more training was required than initially thought, both for students to become familiar with the tool and for instructors to learn how to teach with it. I also found that, as effective as Web 2.0 tools might be, students do not see them as replacements but rather as additions. Thus, they do not necessarily feel that they learn more when they use Web 2.0 tools; they may simply express a preference for them, even though test scores may vouch for an actual learning effect. This last point may speak to a smooth integration of Web 2.0 tools and instruction: students using Web 2.0 tools may be learning more easily, without much catching up to do for high-stakes exams. From my readings, I also formed the impression that the most salient value of Web 2.0 tools in learning is their collaborative nature: the fact that students feel the responsibility of being members of a learning community that is to produce an authentic product for authentic audiences, and to do so in a 21st-century kind of way.

Yes, there is one caveat here: all the studies I reviewed were limited in duration and may have suffered from the novelty effect, i.e., the tendency for individuals to perform better at first when new technology is involved, out of increased interest in the new technology itself and not necessarily in the content it delivers. But until we have longitudinal studies, and until a new contender enters this arena, we have to go with what we have, and by now this juror has reached a verdict: Web 2.0 tools do work.

Laura Moin, Ph.D.

Academic Trainer & Consultant, Teaching & Learning Group

References

Malhiwsky, D. R. (2010). Student achievement using Web 2.0 technologies: A mixed methods study (Doctoral dissertation). Open Access Theses and Dissertations from the College of Education and Human Services, Paper 58. http://digitalcommons.unl.edu/58

O'Bannon, B. W., Lubke, J. K., Beard, J. L., & Britt, V. G. (2011). Using podcasts to replace lecture: Effects on student achievement. Computers & Education, 57, 1885-1892.

Su, A. Y. S., Yang, S. J. H., Hwang, W. Y., & Zhang, J. (2010). A Web 2.0-based collaborative annotation system for enhancing knowledge sharing in collaborative learning environments. Computers & Education, 55, 752-766.