Online Blogucation
28 Dec 2011

Plagiarism and Research: When does the teaching and learning stop and the cheating begin?

When I start out to write these blog articles, more often than not, I do not have a topic in mind. I find it interesting how sometimes it just all comes together. Monday, on our lunch break, one of my colleagues mentioned how her daughter, who is in third grade, is learning how to take notes. This evolved into a discussion about learning how to research, take notes, and write academically.

I consider myself lucky in this respect because I distinctly remember learning how to take notes and research in 5th grade, 7th grade and 9th grade. In 7th grade, I was enrolled in a zero-hour-type class scheduled during a 20-minute daily break. We used that time to conduct a year-long research project. We used the Reader’s Guide to find current articles on our topics. We used note cards for note-taking, with the bibliographic details at the top, a direct quote on the front and a paraphrase on the back. I remember struggling with the paraphrasing. The fact that we used the same process in 9th grade helped a little bit, but I don’t think I got a follow-up or re-teaching of the concept until I was working on my master’s degree. I fear many of my paraphrases in the many papers I wrote in high school and as an undergraduate were quotes with a few words changed rather than true paraphrases.

Even with that scenario, I feel lucky that I had such scaffolded teaching on that topic throughout my K-12 years. I seemed to be way ahead of my peers when it came to academic writing in college courses. My colleagues at lunch on Monday confirmed that some experienced direct teaching of researching, note-taking and writing while others just sort of figured it out. That is probably why, early in my teaching career when I taught 7th grade geography, I made my students write a research paper and I graded all 125 papers for content, grammar and citations. It is only with practice, feedback and more practice that individuals learn to write academically and learn to correctly cite all sources.

This discussion at lunch was then followed by an article I read in the Cornell Daily Sun about professors at Cornell University and their differing perspectives on using a plagiarism tool such as Turnitin as part of the process for academic paper submissions. Reading the article, you can see that some professors do not like such tools because they signal to students that you assume they are cheating or plagiarizing and intend to catch them. It creates a relationship founded on mistrust from the beginning.

I actually disagree with this point of view and tend to agree more with Professor Peter Katzenstein, who is quoted in the article. He says: “I don’t regard Turnitin as a tool for detecting or monitoring student plagiarism. It is, rather, a tool of great use to professors, graduate students and undergraduates for verifying authenticity and originality of scholarship.” Just as we need to teach our students how to research and write throughout their K-12 learning, that teaching needs to continue through their higher ed years. While earning a liberal arts degree, I completed many research papers in my 7 years in higher ed. I would have loved to have a tool like Turnitin to help me check my citations and be sure I had done them correctly. It is nice to have the tool available for each professor to decide when and how to use it. From a student perspective, I would like to have the tool. I don’t think I would view it as “catch me cheating” software.

I’m guessing the colleague’s daughter in 3rd grade who is learning how to take notes will probably be a pretty good academic writer by the time she completes her schooling. The more tools and opportunities she has, the better she will be able to hone those skills. Don’t we wish all students arrived for their freshman year of college with a good foundation in research writing? It is nice that we don’t have to use note cards and Reader’s Guides any more. The software opportunities are endless. Let us hope that our institutions see the value in such programs and make them available for use by faculty and students.

Pamela Kachka, M.A.Ed.
Academic Trainer & Consultant

Cited Sources:

Purdue Online Writing Lab. (2011). Paraphrase: Write it in your own words. Retrieved from http://owl.english.purdue.edu/owl/resource/619/01/

Rathore, M. (2011, November 16). Professors differ on effectiveness of plagiarism software. The Cornell Daily Sun. Retrieved from http://www.cornellsun.com/section/news/content/2011/11/16/professors-differ-effectiveness-plagiarism-software?mkt_tok=3RkMMJWWfF9wsRonvKTNZKXonjHpfsX56OwoXaKylMI/0ER3fOvrPUfGjI4ARcdiI/qLAzICFpZo2FFRCuGHfYRJ/fhO

Readers' guide to periodical literature. (2011, October 29). Retrieved from http://en.wikipedia.org/wiki/Readers'_Guide_to_Periodical_Literature

Turnitin--about us: Newsroom. (2011). Retrieved from https://www.turnitin.com/static/aboutus/newsroom.php

22 Dec 2011

Search and Rescue

I recently attended the Sloan-C ALN Conference and watched an engaging plenary talk given by Howard Rheingold, who discussed his idea that mastering “Crap Detection 101” is a necessary skill for students (or anyone) to have. This is always a relevant topic, but it was especially timely given that Howard was discussed in an article I read around the same time: a column titled “Why Johnny Can’t Search” from Wired Magazine (Thompson, 2011).

In addition to mentioning Crap Detection 101, Thompson mentions two interesting studies, including one by Professor Pan at the College of Charleston, in which Pan measured how skilled students were at internet searching by having them use Google to answer a series of questions. Not surprisingly, Pan found that students relied on the top hits in Google, even when Pan had artificially changed the search results so lower results showed up first. Students were not verifying the quality of the search results they found; they were relying on Google to do this for them. Another study mentioned was conducted at Northwestern, where none of the 102 undergraduates studied checked the authors’ credentials on the internet sources they used (Thompson, 2011). My personal teaching experience aligns with these findings.

So why are researchers (and teachers like myself) finding these trends? Thompson suggests that schools aren’t teaching how to conduct intelligent internet searches, and more importantly, aren’t teaching students how to critically evaluate sources once they find them. It’s possible that a K-12 curriculum focused on prepping students for exams doesn’t include time for this type of instruction on information literacy, but then university instructors assume that students already know this information and so don’t focus on it in their classes. As Thompson comments, “this situation is surpassingly ironic, because not only is intelligent search a key to everyday problem-solving, it also offers a golden opportunity to train kids in critical thinking.”

Fortunately, there are plentiful online resources that help teach these skills (assuming you know how to find them in a search, ha ha), including lesson plans and sample activities. A useful method for website evaluation is the CRAAP test: Currency, Relevance, Authority, Accuracy, Purpose (originally developed by Meriam Library, CSU Chico). Another fun way to approach this is to use spoof websites to help students learn that simply finding something on a website doesn’t make it truthful or reliable; a list of sites, including the online pregnancy test and save the tree octopus, can be found here. And finally, another valuable website (not just for students!) is Snopes.com, which helps you identify the truth behind urban legends and misinformation (such as those email chains that go around: no, if you forward this to 50 people in the next five minutes, you will not receive a free computer). So let’s get started teaching students how to search!

– Gail E. Krovitz, Ph.D. –

Director of Academic Training & Consulting

Thompson, C. (2011, November). Why Johnny can’t search. Wired Magazine. Retrieved from http://www.wired.com/magazine/2011/11/st_thompson_searchresults/

14 Dec 2011

They’ve left us, but that doesn’t mean they aren’t still students

The National Student Clearinghouse Research Center released a Snapshot Report last week with some interesting new data on student persistence. To obtain a copy of the report, visit their website at http://research.studentclearinghouse.org. According to the Research Center, "students were counted as having persisted if they: 1) remained enrolled in any postsecondary institution 60 days after the end of the term that included October 15, 2010, or 2) completed a degree within 60 days of the end of the term that included October 15, 2010."
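
To make that definition concrete, here is a minimal sketch, in Python, of how a single student record might be classified under the rule. The field names and dates below are invented for illustration; they are not the Clearinghouse's actual data format.

```python
from datetime import date, timedelta

def persisted(fall_term_end, enrollments, degree_dates):
    """Classify one student under the Snapshot Report definition:
    persisted = still enrolled at ANY postsecondary institution 60 days
    after the end of the term that included October 15, 2010, OR completed
    a degree within 60 days of the end of that term. (Hypothetical record
    layout; the Clearinghouse's actual files differ.)"""
    checkpoint = fall_term_end + timedelta(days=60)

    # 1) Any enrollment spell, at any institution, covering the checkpoint date.
    still_enrolled = any(start <= checkpoint <= end for start, end in enrollments)

    # 2) Any degree completed within the 60-day window after the term ended.
    completed_degree = any(fall_term_end <= d <= checkpoint for d in degree_dates)

    return still_enrolled or completed_degree

# A student who left her first school but transferred in January 2011
# still counts as persisting, which is exactly the report's point.
print(persisted(
    fall_term_end=date(2010, 12, 17),
    enrollments=[(date(2010, 8, 23), date(2010, 12, 17)),   # original institution
                 (date(2011, 1, 10), date(2011, 5, 6))],    # transfer institution
    degree_dates=[],
))  # True
```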

The Research Center was able to identify students persisting in higher education regardless of whether the students remained at a single institution or moved among institutions. Accounting for this student movement, researchers found that overall, 84.7% of students persisted in higher education. Data were further broken down between full- and part-time status, with 92.5% of full-time and 71.2% of part-time students identified as persisting. An examination of persistence rates by type of institution attended revealed that the highest rate (91.4%) was found among students attending private, not-for-profit, 4-year institutions, while the lowest rate (74.9%) was among students attending public, 2-year institutions.

These findings are encouraging, as they show that while some students leave an institution before earning a degree or certificate, many continue their education at another institution. These "leavers" are typically viewed as drop-outs, an undesirable outcome from the institution's perspective. But because of the data reported by the Research Center, we can see that many of these students are, in fact, persisting and have just moved from one institution to another.

Institutions participating as data providers to the National Student Clearinghouse are able to use the data to determine how many of their former students are continuing at other institutions and can make adjustments to their own reports on persistence and completion. The data can also be useful to states and others who are interested in better understanding the enrollment patterns of today's college students.

The bottom line for those of us interested in seeing all students succeed is that the picture is not as bleak as our previous, incomplete data on persistence would have us believe. And even more importantly, these findings suggest that students seem willing to continue their education even if, for whatever reasons, they have left one institution at some point during their educational journey.

Kimberly Thompson


7 Dec 2011

Up In Lights

I just got back from delivering a keynote address in Berlin at Online Educa. It was an amazing experience. Not only was the conference packed with over 2000 people, but the city of Berlin was quite breathtaking at this time of year. Everywhere you look in Berlin there is some kind of Christmas decoration, tradition, or ornamentation. People gather at the Christmas markets to drink Gluehwein (a spiced, mulled wine that smelled delicious) and sales abound in the shopping areas.

So as I was walking through one of the markets with some friends, I thought back to the decorating of my own tree just a few weeks ago, which led to thoughts of…instructional design! (Seriously, I need a break.) With a four-year-old, Christmas came early this year and we had our tree up on Thanksgiving Day!

But the lights on the tree, specifically, were quite an ordeal. Actually, they still are. See, last year we bought a new tree. We took our daughter down to “St Nick’s” Christmas store (no joke) and asked for a guided tour of the new trees. While the trees look amazingly real, they ALL (100%) had a major flaw. It was impossible to buy a tree without pre-decorated lights! And not just pre-decorated, but all white lights. Ugh.

Of course, I get why they do it. Most people hate lighting the tree. It’s time consuming, you end up missing spots, and the only thing worse than getting the lights on is taking them off. But I knew then what proved to be true this year: pre-lit trees are not what they appear to be. See, this year, EXACTLY what I asked the sales-elf about last year happened:

ME: “What happens if a light goes out?”
ELF: “That hardly ever happens!”
ME: “Okay, but what if it does?”
ELF: “Well, the lights aren’t connected like they used to be.  If one goes out, it doesn’t affect the others, it just goes out.  You can replace it or leave it, but the rest of the lights will shine.”
ME: “Riiiiiiiight….”

You can probably see where I’m going with this. This year, just as I suspected, we got the tree up, plugged it in, and yep, you guessed it – the entire middle of the tree was black. So I got to spend about an hour finding, unplugging, and re-plugging new lights into the old sockets, hoping each one would light the strand back up. (I never got more than 4 in a row to light up with any new bulb…)

Alright, enough about my holiday nightmare.  So what does this have to do with Instructional Design?  Well, as I stood there checking bulb after bulb, I realized that some schools are taking this approach to their online courses.  The premise is simple:  Most instructors don’t have any education around teaching.  Instructional designers know how to design quality courses.  So, create a course with a group of designers and let a dozen different faculty teach it.  Done and done!

But, of course, the analogy then starts to take over. What if you allow instructors to change the course? Some of those new courses will be awesome – amazing even! Others will be like a darkened bulb bringing down the outcomes average for the department. What if it’s a blinking-strand kind of course? In other words, what if it has all kinds of whiz-bang media and social interaction? The answer there is that most faculty would need a boatload of instruction just to teach it. (This is why most standardized courses don’t have cool stuff…they just have text, pictures, and some videos. It’s easier to deliver, even though it’s not nearly as engaging for students.) This straightforward approach to design for mass clusters of courses would be the equivalent of an all-white tree. Guess what? I don’t WANT an all-white tree. That’s why last year I spent about 3 hours going through and changing out 4 out of 5 bulbs to a color. I want color. I LIKE color.

OK, you say – so let’s not use instructional designers. Let’s let faculty design all of their own courses! Guess what you get then? You’ll get some lights perched perfectly on the limbs. They will be unobtrusive, casting a healthy glow from the inside of the tree, almost as if the tree itself is on fire. But you’ll also get…well, you’ll get the Griswold tree too. You’ll get lights that look as if they were flung on the tree by a four-year-old with a slingshot, appearing as if they may fall off at any minute. You will get some bulbs that are significantly dimmer than others. You’ll get 5 reds in a row. You’ll get classes that have nothing but text and no interaction with the professor except for an occasional rant and the final, posted grades at the end of term.

See, I’ve said it before and I’ll say it again.  There HAS to be a better way.  There has to be a healthy mix of instructional design, subject matter expertise, and personal touches that allow a class to be unique, engaging, and a quality experience in terms of assessment.  The school that figures out how to truly mix sound pedagogy with effective delivery and authentic assessment in a media rich, social environment will rule the world.

But until then, we’ll have to take it one light at a time. We’ll have to create the best possible bulb selection for our trees or try to create at least tri-color trees that are uniform. But one day…it will be different.

Oh, by the way, when I landed in Germany my daughter got on the phone.  She just HAD to tell me something.

ME: “Hey Peanut!”
ADDIE: “Hi Daddy.”
ME: “What’s going on, sweetheart?”
ADDIE: “The middle of the tree is dark again Dad…”
ME: Guttural moaning...

Happy holidays, and may your light shine brightly in whatever educational environment you teach. Good luck and good teaching.

Dr. Jeff D Borden
Sr Director of Teaching & Learning

30 Nov 2011

Whom will the data serve? Thoughts on Usefulness and Portals for Education

As noted in the article “Salman Khan: The New Andrew Carnegie?”:

...knowledge no longer needs to be bound into the paper and cloth of a book but can float free on the wireless waves of the Internet. There’s a lot of junk bobbing in those waves as well — information that is outdated, inaccurate, or flat-out false — so the emergence of online educational materials that are both free of charge and carefully vetted is a momentous development. This phenomenon is all the more significant given the increasing scrutiny directed at for-profit online universities, which have been criticized for burdening students with debt even as they dispense education of questionable usefulness. Websites offering high-quality instruction for free are the Carnegie libraries of the 21st century: portals of opportunity for curious and motivated learners, no matter what their material circumstances (Paul, 2011, para. 6).

I pursue the goal of excelling as an engineer or architect of learning and of being otherwise associated with the proliferation of "portals of opportunity for curious and motivated learners, no matter what their material circumstances" (Paul, 2011, para. 6). In some sense, I am these things already as an Academic Trainer and Consultant with Pearson eCollege. If I had a personal mission statement, it would be worded similarly, and my destiny would be to serve in an industry associated with or embedded within the systems of education.

Yet, that’s not the point of this post!

I found it interesting that the Paul (2011) article quoted above suggests the phenomenon of high-quality, vetted online materials "...is all the more significant given the increasing scrutiny directed at for-profit online universities, which have been criticized for burdening students with debt even as they dispense education of questionable usefulness."

Could not many of us argue that public colleges and universities also "dispense education" of "questionable usefulness"? Actually, many might also debate whether education is dispensed or received or shared or…

Wait, that’s not the point of this post either!

So, what is the point you ask?

The point is to consider critically the reality that all colleges and universities - regardless of profit motive or mission statement - are justifiably susceptible to this questioning of usefulness. Knowledge and skills needed for professions and trades evolve quickly, in part because of the globalization of knowledge and the virtual removal of barriers to information access through the internet for a large portion of the world's population (though certainly not all of that population!). Let's question some things...

Could we argue that a nursing or teaching degree in the United States from 1990 is as useful today in the same locale as one from 2010? Does locale matter? How does that impact usefulness?

Does on the job real-world apprenticeship style workflow-learning add value to the formal education received? If yes, how is that measured?

If a graduate does not continue professional or personal development after graduation in order to become or remain productive in the workforce as a laborer or entrepreneur, does that necessarily reflect negatively on the value of the educational portals provided by a college or university?

Yes, that’s the point.

While there is much that can be unpackaged from the messages of the selected quote opening this post, the point of this post is to ask you to think critically about what we are measuring when we refer to educational usefulness, how we are measuring it and defining the variables associated with the measures, and ultimately why we are measuring it – whom will the data serve?


Lisa Marie Johnson, Ph.D.
Academic Trainer & Consultant
Pearson eCollege

Reference

Paul, A.M. (2011, November 16). Salman Khan: The new Andrew Carnegie? The emergence of free, high-quality online courses could change learning forever. Retrieved from Time Online Magazine, Ideas section: http://ideas.time.com/2011/11/16/salman-kahn-the-new-andrew-carnegie/

23 Nov 2011

What Constitutes Teaching in an Online Course?

A recent article in The Chronicle of Higher Education, “Students of Professor Who Didn’t Show Up Keep Their A’s and Get Refunds, Too,” caught my attention a few weeks ago. Initially, as I read the headline, I assumed that the professor in question had not shown up to teach her on-ground course. As I read the article, though, I realized that the instructor in question was fired for not teaching an online course: the third paragraph of the article states that “Venetia L. Orcutt, department chair and director of the physician-assistant-studies program” at the George Washington University School of Medicine and Health Sciences, was “assigned to teach a sequence of three one-credit courses in evidence-based medicine over three semesters last year. The first semester of the required course was face to face, and she showed up for that. But according to three students who complained to the university’s provost, Ms. Orcutt went missing when the course sequence shifted online” (Mangan 2011). The article does not explain which of her teaching duties Ms. Orcutt failed to carry out (nor could I find this information in the two other articles I could locate on the case, which seems to have received relatively little attention); it says only that she “went missing” and then assigned all the students in the course A’s.

How exactly does an instructor “go missing” in an online course, I wondered? How much negligence constitutes not teaching an online course? For instance, if you don’t participate at all in threaded discussions, does that constitute not teaching your online course? What if you answer questions in the threaded discussions but don’t grade any student work, or fail to keep your gradebook up to date?

These questions may seem overly simplistic, and it does seem logical that a university firing an instructor for “not teaching” an online course would probably result from neglect of all of the above duties: no appearance in the threaded discussions, no grading of student work, no answering of student questions. But I wondered whether this case might open the door for institutions to decide, for instance, that online instructors who fail to respond to students in the threaded discussions may be in breach of their contracts with the institution and thus at risk of losing their jobs, or that instructors who don’t grade student work should be fired. This question was particularly interesting to me given recent feedback from some of my own online students. I always use the last threaded discussion of one of my online courses to solicit feedback from students about what worked well in the course and what didn’t; this term, which just ended last week, I had for the first time students telling me that they really appreciated my doing things that I consider pretty basic aspects of my job: participating in the threaded discussions, responding to their questions punctually, being available for extra help over email, using the course tools available in my institution’s LMS rather than email to accept student papers, and keeping the gradebook updated.

Coupled with this anecdotal evidence from my students, I have seen firsthand a variety of online courses that demonstrated exactly the shortcomings my students were complaining about in their other online courses—courses in which there is no record of students submitting assignments, much less of the instructors grading them; threaded discussions where the students post regularly, for up to eighteen weeks, with no response from the instructor; gradebooks that are either empty, incomplete, or inaccurate. Such behavior would not be tolerated for long, I suspect, in most on-ground courses; what makes it acceptable in online courses?

The short answer is that it is not acceptable. There are two problems, though, that make such occurrences more common in online teaching. The first and largest of these is that many online instructors don’t receive adequate training; they don’t know how to use the tools in their LMSes and thus lack basic knowledge that would facilitate the teaching of their courses online. This is an institutional problem more than it is an individual problem. Certainly, there are online instructors who persist in believing that online teaching is somehow secondary to on-ground teaching, and that they need not actually do the same kinds of work in their online classrooms, but most online instructors, I think, want to do a good job and sincerely believe that they are. They simply haven’t received the training they need to make full use of the tools available to them, nor do they understand that there are strong pedagogical reasons for maintaining an active presence in an online course. The second problem is that it is much more difficult to measure how long online teachers spend teaching, particularly if those online teachers aren’t using tools in the LMS that record activity for the time spent using these tools. If, for example, instructors ask students to submit papers via email rather than using a dropbox tool in the LMS, it’s more difficult (not impossible, but more difficult) for those instructors to show that they’ve collected and graded those papers. If instructors are primarily interacting with their students over email, not responding to or facilitating threaded discussions in their online courses, it is again more difficult to ascertain how long these instructors are spending actually teaching and interacting with their online students. This second problem thus relates to the first: these instructors haven’t received really basic information about why and how they should use their LMS’s tools to teach their online courses. The good news is that both problems are easily fixed: institutions need to make adequate training, both in terms of basic instruction for online teachers about how to use their LMSes and in terms of the pedagogical principles of online instruction, their first priority.
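
As a purely hypothetical illustration of that last point (the event names and log format below are invented, not any particular LMS's reporting feature), work done inside the LMS leaves a record that can be summarized, while work done over email generally does not:

```python
from collections import Counter
from datetime import datetime

# Invented event log: (timestamp, instructor id, action recorded by the LMS).
events = [
    (datetime(2011, 11, 7, 9, 15), "prof_a", "discussion_post"),
    (datetime(2011, 11, 7, 9, 40), "prof_a", "grade_entered"),
    (datetime(2011, 11, 9, 20, 5), "prof_a", "dropbox_feedback"),
    (datetime(2011, 11, 21, 8, 30), "prof_a", "discussion_post"),
]

def weekly_activity(events, instructor):
    """Count logged teaching actions per ISO week for one instructor."""
    counts = Counter()
    for timestamp, who, _action in events:
        if who == instructor:
            year, week, _ = timestamp.isocalendar()
            counts[(year, week)] += 1
    return counts

# An instructor who has "gone missing" simply shows weeks with no entries.
for (year, week), n in sorted(weekly_activity(events, "prof_a").items()):
    print(f"{year}, week {week}: {n} logged actions")
```

Nothing about this requires sophisticated analytics; the point is simply that using the LMS's own tools makes an instructor's presence visible and verifiable.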

Jennifer Golightly, Ph.D.

Academic Trainer & Consultant

Works Cited

Mangan, Katherine. 2011. Students of professor who didn’t show up keep their A’s and get refunds, too. The Chronicle of Higher Education (November 9). http://chronicle.com/article/Students-of-Professor-Who/129709

16 Nov 2011

Google Voice for Teaching Presence & Community

Abundant research has confirmed for us the power of instructor presence, immediacy and feedback in online courses. Conceptually, “immediacy” refers to behaviors that lessen the “psychological distance between communicators” (Wiener & Mehrabian, 1968). In practice, and in particular as it applies to online learning and course delivery, the behaviors and practices that generate and sustain instructor immediacy must often occur in scenarios of total or varying degrees of physical distance and separation of course participants. The power to connect with students whom we do not physically see or meet with in person can have much to do with the resources available to us for keeping lines of communication open.

For online instructors, a critical need is to have the capacity for communication, just as we would have in a face-to-face classroom. Instructor-to-student communication around the delivery of content, course information, grades, assessment feedback, etc., as well as the need for student-to-student communication around course concepts, discussions or group work, illuminates the need for a wide range of tools to facilitate communication exchanges.

How do you foster immediacy and the communication you need to have with your online students?

If you teach with Pearson’s LearningStudio system, you might use the Announcements and Email tools for asynchronous communication with your students. You might also be using the Threaded Discussion tool for targeted areas of communication such as an instructor virtual office, a weekly topical discussion or dedicated group work areas. When synchronous communication is preferred, you might also use ClassLive or the Chat tool. All of these measures share the function and outcome of providing means through which communication, interaction and dialogue can occur.

In addition to in-course tools for communication such as those briefly highlighted earlier, the greater Web community offers a wide range of tools you and I can use to generate channels of communication. In this post, I’ll highlight my most recent experience with Google Voice as a communication tool in my teaching.

Google Voice isn’t a new feature from the well-recognized Google family, but it is one that is still unfamiliar to many and is being continually improved. Google Voice offers key functionality you may consider to be helpful to you in communication with your students.

What is Google Voice?

Google Voice can be used to enhance the existing capabilities of your phone, regardless of the phone or carrier you might use. Key features include:

  • One Number: Use a single number that rings you anywhere.
  • Online voicemail to your inbox like email.
  • Transcribed messages.
  • Free calls & text messages to the U.S. & Canada.

Directly from the welcome email I received upon signing up, here is a brief highlight of Google Voice and a few of its features:

“Welcome to Google Voice. Google Voice gives you a single phone number that rings all of your phones, saves your voicemail online, and transcribes your voicemail to text. Other cool features include the ability to listen in on messages while they're being left, block unwanted callers, and make cheap international calls. We hope you enjoy using Google Voice.”

Your inbox, messages, features and settings can all be accessed and customized from your Google Voice Page (google.com/voice).

Here’s how I am using Google Voice with my students

So, here is a brief overview of my current setup and a few of the ways I use Google Voice to enhance the phone communication channel with my students. First, I did some research on Google Voice before signing up and discovered the ability it would provide me to essentially bridge my current phones and receive voicemails all in one place: my Google Voice inbox. I decided to give it a try!

During the sign-up process, I was asked whether I wanted to use Google Voice with my existing mobile phone number or a new phone number from Google. I chose to create a new phone number, figuring I could later decide whether I would actually use it. As it turns out, having a Google number, which is free, opened up additional possibilities for me in leveraging Google Voice for phone communication with my students. With a Google number, I was now able to provide my students with one phone number where they could call me, and I could set parameters around what occurs when a student calls that number, such as which of my actual phones (home, mobile, etc.) ring when my Google number is called. I was also able to set when my phones can and cannot ring, and the greeting that students hear when I don’t answer their call immediately. Awesome! (More on this later in this post.)

Having a Google number has also meant I do not need to provide students with my personal mobile number or home number, but can set Google Voice to route my calls to those personal telephones, if I choose to. When I am not available to answer their call, students hear my personal greeting and are able to leave a voicemail message. That message then arrives in my Google Voice inbox as transcribed text that I can actually read prior to or in place of listening to the voicemail message.

In this example, one of my students called my unique Google number. I wasn’t available to answer at the time of the call, so my student was routed to my personal greeting message and left a voicemail for me. Within seconds, I received the voicemail alert in my Google Voice inbox and was able to read what the student said in the voicemail message. The alert in my inbox displays the number from which the voicemail was received, the date and time at which it was received and a transcribed version of the voicemail message. Pretty cool!

[Image: transcribed voicemail message in the Google Voice inbox. “Sample Student” and “(123) 456-7899” have been layered over the original name and number.]

A few days after the original setup, I spent a bit of time exploring the Settings area within the Google Voice page and noticed several handy features that could help me even further. As mentioned earlier, I was able to schedule when my phones would be able to ring with a call and when they should not. This means I don’t have to worry about hearing my phones ring when a student calls me in the middle of the night. :-)  This is particularly useful when your online students span the country and/or the globe and could be calling you from a wide range of time zones!

Within settings, I also noticed the ability to elect to have text message alerts of new Google Voice activity sent directly to my cell phone. I went ahead and signed up to receive a text message alert on my personal cell phone whenever a student leaves a voicemail message at my Google number. (Keep in mind this is something you can change or cancel at any time). For me, this handy feature has meant I am automatically alerted of a student’s voicemail message even if I am not in my Google Voice inbox or even on my computer. Below is a screen capture of what the text message alert looks like on my cell phone. Notice the transcription of the voicemail message directly within the text message alert on my screen, giving me the ability to ‘see’ the content of my student’s voicemail message without having to be on my computer, log into my Google Voice account, or even ‘listen’ to the voicemail message:

[Image: text message alert of a Google Voice voicemail on my cell phone. “Sample Student” and “(123) 456-7899” have been layered over the original name and number.]

I’ve certainly noticed instances in which the transcription of a voicemail isn’t complete. When this happens, I am able to listen to the actual message to hear my student’s recording directly. I am also able to alert Google of when this occurs and even “donate” the transcript for improvement of Google Voice in the future.

As you’ve probably figured out by now, having a Google number and Voice account means I can also receive text messages from my students, free of charge. Now, I know some of us as instructors are not sold on text messaging with our students; you can simply use whichever features of this tool are helpful to you. I’ve found that adding a “Contacting Your Professor” announcement in my online course, including within it several parameters for communication and my expectations for students, really helps clarify how I encourage my students to communicate with me, minimizing issues around appropriate methods of contact.

When a student does send me a text message (which I’ve noted in my announcement as an acceptable form of communication), the text message arrives in my Google Voice inbox, just like an email would. It is organized under “Texts” and I have elected to have alerts of new text messages to my Google number sent directly to my cell phone. The alert includes details of the message as well as the text itself, much like the earlier screen capture. And I can choose to respond with a text message, a phone call, or any other method I’d like to use.

Additional thoughts on Google Voice for Communication and Community

I’ve been using Google Voice for some time now and it has become a frequently used tool in my toolkit as I seek to build effective communication with my students. I’ve also been able to refer my students to sign up for their own Google numbers to be able to make free nationwide calls to classmates throughout the course and to team members during a group project. All the student messages I receive, including voicemail messages and text messages, are stored within my Google Voice inbox and I can access them from anywhere I can access the internet. I can choose to delete messages once I’ve responded to the student or keep the messages stored in my inbox, which means I no longer have to worry about time limits for the storage and retrieval of students’ messages. As a robust phone communication tool to support what I am already doing in my online courses and the ways I seek to foster immediacy, Google Voice is certainly worth a try.

Here are some resources to get you started

If you’re curious about getting started with Google Voice, take a look at this brief introductory video: What is Google Voice? or visit the Google Voice YouTube channel to see more. To sign up, go to google.com/voice. Click “Try it Out” to sign up with a current Google account or to create a new Google Account. Once your Google account has been created and verified, click on the Voice logo and follow the on-screen instructions to get started. Again, you’ll have the option of using Google Voice with an existing mobile phone number or creating a brand new (and free) Google number.

By the way, if on-the-go advanced calling and voicemail functionality is a welcome addition to your communication toolkit, you can download the mobile app, which gives you access to your Google Voice inbox and messages right there on your mobile phone. Pretty handy if you like to stay connected while mobile! With the app, I am able to call students or send them text messages that appear as though they are coming from my Google phone number even though I am actually sending them from my personal mobile phone. Pretty cool stuff! Plus, it’s totally free to do within the U.S. and Canada. Currently, the app is available for BlackBerry, iPhone and Android-powered phones. For more information or to download the app, search for “Google Voice” in your app marketplace.

Until next time, I wish you the best in your courses. Be sure to share a comment with us via this blog if you use Google Voice or another handy tool for communication with your students!

Rachel Cubas
Academic Trainer & Consultant
Assessment & Analytics Group | Academic Training & Consulting Team (ATC)
rachelc@ecollege.com

References:

Wiener, M. & Mehrabian, A. (1968). Language within language: Immediacy, a channel in verbal communication. New York: Appleton-Century-Crofts.

9 Nov 2011

Whaddya mean, it’s free!?

In case you missed it, Pearson made an announcement a few weeks ago followed by some serious marketing during EDUCAUSE about one of our newest products, OpenClass. OpenClass is “breaking down barriers and transforming the learning environment,” says Adrian Sannier, Pearson’s Senior V.P. of Learning Technologies. Why? Because, in short, it’s free. As in really free – it’s hosted in the cloud, so there are no hosting costs. There are no licensing costs. In fact, if you’re a school with Google Apps for Education you can start using it right now. Free. Just click here.

But my post today is not to talk about OpenClass directly. I’m not going to try to sell you on it or demonstrate it or talk about all of its amazing features. Instead, I want to talk about some of the reactions we’ve seen to OpenClass. It’s my contention here that we in the educational world (and in our consumer culture in general) have become so suspicious of the word “free” that we can’t possibly believe that something really could be free. Once bitten, twice shy, right?

The Chronicle of Higher Education and Inside Higher Ed both wrote articles on OpenClass following Pearson’s press release. Fine articles – balanced points of view, a few questions that need to be answered, and so on. And kudos to them for taking on these national discussions on the idea of a free LMS.

The heart of the matter, though, comes down to the comments posted by many of those publications’ readers. I’m going to share a few of those comments here, not to poke fun at the authors or to say that they’re wrong or that there’s anything wrong with what they posted. I’m all for free speech and for potential end users to challenge Pearson to deliver on its promises. What I want to point out is the amount of cynicism we see in the world of Learning Management System adoption. Let’s start with a sample of comments (unedited):

  • free-hosting sounds great, ...but at what price? what sorts of idiosyncrasies and limitations will this cloud-based LMS have?
  • I can almost picture the pop-up ads in OpenClass--"wouldn't you love to be able to [insert Learning Studio feature not present in OpenClass] ?"
  • Nothing is free!
  • this may not be as “free” as it looks. For a campus to integrate an LMS into their academic mission, it takes time, money and cooperative relationships with faculty.
  • I question how free OpenClass really will be. Pearson is a for-profit publisher and, to use OpenClass, I suspect they will have customers to use their textbooks under the guise of an integrated learning platform…I sense there are many strings attached to this so-called free platform
  • Like other people, I’m also wondering how “free” this can really be. LMS adoption is a costly process -- in terms of time and money. Plus, a newer LMS is bound to have more problems than better-established LMS that have been evolving over a decade or more.
  • While it is nice that "free" (as in gratis) is referenced, it is certainly not Free (as in libre). Of course, one does have to wonder how long the "free" part will last...
  • Can we please define "free?" It seems very limited to think of costs only directly related to hosting the application(s) and maintaining the hardware. Is it free in the sense that open source software is free (e.g. free as in speech vs. free as in beer)?
  • Pearson could cancel OpenClass at any time, or not fix bugs or insert ads or just stop adding any features or upgrades, and there is nothing anyone can do about it - you're locked in.
  • Yeah..it's good.But would you mind if i ask you a question? well I am 31 years of age from Tanzania East Africa I am looking for a sponsor for  my master's degree any where can you help please..

Okay, maybe that last one’s not really on point. But I think you get the picture I’m painting here: people are surprised, suspicious, and even (at times) hostile toward the idea of a free LMS. Several readers/commenters act as if Pearson is a drug dealer, using OpenClass to give people a taste, getting them hooked, and then causing them to take out second mortgages on their universities just to stay in the LMS.

Not true! Look, I’m biased here. I see that. I work for the company that makes OpenClass. But I’m also an academic, have years of teaching experience at the university level, and have years of experience with a variety of LMSs. So I know where these readers are coming from. Nothing is ever as free as it seems, right? There are always hidden costs. We, as a consumer culture, have become desensitized to the word “free” because, as one commenter so astutely wrote above, “Nothing is free!”

For example, there are a lot of other LMS offerings out there that purport to be free, but limit you in terms of the number of courses you can create or the number of students you can enroll. OpenClass is not that. There are other LMSs that provide the backend code for free so that you can essentially create your own LMS using their code. Except that you have to pay for hosting – even if that means just a lousy few thousand dollars on some servers and routers. OpenClass is not that.

In short, these other “free” offerings have brought out the cynic in many of us, that anything that says it’s free can’t possibly really be free. We’ve been burned too many times before.

OpenClass is free. You can get it for free out of the Google Apps Marketplace, create a course, enroll students, and run with it. You can create ten thousand courses with 100 students in each. Let your imagination run wild. It’s free. If you still don’t believe me, try it out. Let me know what you think.

Rob Kadel, Ph.D.
Pedagogy & Training Group Supervisor
Academic Training & Consulting

2 Nov 2011

The Buzz on Assessment

I had the pleasure of attending the 2011 Assessment Institute in Indianapolis this week. The conference is the nation’s oldest and largest event focused exclusively on outcomes assessment in higher education. Administrators, faculty and student affairs professionals convened this week to discuss techniques and approaches across outcomes assessment areas. This year, the event featured tracks on Capstone Experience, ePortfolios, and Faculty Development, among others.

I’d like to share with you a few of the recurring themes I heard and will take with me from the keynotes, workshops and best practice sessions. Specifically, I will share three themes and considerations that may serve as markers for some of the noteworthy issues in the higher education outcomes assessment landscape.

The first two themes are indeed linked in both process and practice, so I will identify both of them at this point. They are: 1) Faculty Engagement and 2) Using Results to Inform Improvement Processes. For those of us who have been doing outcomes assessment for any extended period of time, these themes may echo many of the questions and issues as well as the successes we have faced.

The engagement of faculty in the assessment process is certainly not a new issue in the practice of assessment. Nevertheless, faculty engagement in the process of outcomes assessment is a reality many institutions are still desiring, and even stretching, to achieve. The collective understanding among practitioners gathered at the event appears to have arrived (or, in some cases, come to a standstill) at a place of resounding confirmation: faculty engagement in the assessment process is a critical component of successful assessment. In her 2010 paper entitled “Opening Doors to Faculty Involvement in Assessment”, Pat Hutchings wrote:

“As Peter Ewell (2009) points out in another NILOA paper, from its early days in higher education, assessment was “consciously separated from what went on in the classroom,” and especially from grading, as part of an effort to promote “objective” data gathering (p. 19). In response, many campuses felt they had no choice but to employ external tests and instruments that kept assessment distinct from the regular work of faculty as facilitators and judges of student learning. In fact, the real promise of assessment—and the area in which faculty involvement matters first and most—lies precisely in the questions that faculty, both individually and collectively, must ask about their students’ learning in their regular instructional work: what purposes and goals are most important, whether those goals are met, and how to do better. As one faculty member once told me, “assessment is asking whether my students are learning what I am teaching.”

Further, the notion was put forward that seeking faculty engagement should not be seen as a one-time achievement but as an ongoing and evolving effort that characterizes a campus assessment strategy. Although the issue is not a new one for assessment, the shared sentiment among conference participants is that garnering this engagement remains a key dynamic and often a great challenge. Several presenters urged the institutions represented at the conference to engage in cross-institutional dialogue to share strategies on how to foster a deeper degree of faculty engagement.

The second recurring theme centers on a question of the value, strategy and purpose of assessment efforts, asking What’s it all for? Assessment is hard work. And the growing sentiment appears to be a desire to see campus assessment efforts translate into actual impact on student learning, beyond the collection of data and documentation for accreditation and/or certification. This pull for results that impact student learning is a call to move beyond data collection and planning of assessment to the informed and strategic improvement of teaching and learning based on the data. To make assessment more useful, we must include within our strategy an intentional approach to leverage data and documentation to help bridge the gaps between our current and improved realities. This process must be ongoing. And it undoubtedly must include faculty.

Finally, the third takeaway comes in the form of a resource. The National Institute for Learning Outcomes Assessment (NILOA) had a strong presence at the 2011 Assessment Institute. Several of the organization’s staff and associates were keynote presenters, and they include a notable group of internationally recognized experts on assessment. NILOA presenters pointed conference participants to what they called the ‘crown jewel’ of the organization’s efforts, a recently enhanced and robust website featuring a collection of papers, articles, presentations, websites and survey results compiled in alignment with the organization’s vision for discovering and adopting promising practices in the assessment of college student learning outcomes. Reviewing the organization’s website will quickly reveal its valuable contribution to the field of assessment and current issues, including those I’ve highlighted from the conference. Take a moment to explore this great resource by visiting www.learningoutcomeassessment.org.

It was certainly a rich experience to attend the conference and have the opportunity to share with institutions and hear the collective voice of higher education assessment practitioners.

Rachel Cubas
Academic Trainer & Consultant
Assessment & Analytics Group | Academic Training & Consulting (ATC)

References

Hutchings, P. (2010). Opening doors to faculty involvement in assessment. National Institute for Learning Outcomes Assessment.