
Tuesday, February 19, 2013

Writer's homework

When I read articles by people who didn't bother to investigate the subject properly, I really wonder why they are getting paid for this. I just read a Guardian piece on the advantages and drawbacks of massive open online courses, AKA MOOCs. It includes the assertion that such courses can only be offered for subjects that lend themselves to multiple choice tests:
Moocs are limited to subjects that can be assessed with multiple choice exams, marked automatically. Written any essays in your degree? Your professor's critique of them can't be replicated by a mooc – yet.
First of all, MOOC providers like Coursera have already come up with a way around that, as I explained in a blog post last year:
Another innovative aspect of Coursera is the way it assesses student work in courses that are not limited to technology or mathematics.
As founder Ng observes, "Multiple choice doesn't really work for a poetry class." Also, with thousands enrolled in a single class, instructors would find it impossible to personalize responses to student work.
Coursera's solution to that problem is the introduction of "a system for peer grading, in which students will be trained to evaluate each other's work based on a grading rubric provided by the professor." This is not all that different from the peer reviews encouraged in writers' groups, which some teachers employ in their own classrooms, though the Coursera system is designed to ascertain that students comprehend the instructor's standards before being allowed to grade another's work.
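To make that idea concrete, here is a minimal sketch in Python of how rubric-based peer grading could be aggregated. The rubric criteria, point values, and use of a median are purely illustrative assumptions of mine, not a description of Coursera's actual system:

# Hypothetical illustration only -- not Coursera's actual implementation.
# Each submission is scored by several trained peers against a rubric;
# a robust aggregate (here, the median per criterion) becomes the grade.

from statistics import median

# Rubric criteria and their maximum points (made-up example values)
RUBRIC = {"thesis": 3, "evidence": 4, "style": 3}

def aggregate_peer_scores(peer_scores):
    """peer_scores: a list of dicts, one per peer, mapping criterion -> points."""
    grade = {}
    for criterion, max_points in RUBRIC.items():
        scores = [min(p[criterion], max_points) for p in peer_scores]
        grade[criterion] = median(scores)
    grade["total"] = sum(grade[c] for c in RUBRIC)
    return grade

# Example: three trained peers grade the same essay
peers = [
    {"thesis": 3, "evidence": 3, "style": 2},
    {"thesis": 2, "evidence": 4, "style": 3},
    {"thesis": 3, "evidence": 3, "style": 3},
]
print(aggregate_peer_scores(peers))
# {'thesis': 3, 'evidence': 3, 'style': 3, 'total': 9}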

Second of all, there are already systems that automate the grading of written work, as I explained here:

For example, Pearson's Write to Learn is designed to offer instant feedback and personalized direction on student writing. The software can be accessed on computers at school or remotely over an Internet connection. Teachers using the software are happy to have much of the grunt work of guiding students through revision and editing lifted from their shoulders. The automated critique also reduces personal confrontations. As one teacher featured in a Write to Learn case study says, there's no "evil professor" who delights in finding fault in student work.
Educational Testing Service's e-rater is another automated assessment tool. It can score 16,000 essays in 20 seconds, a breathtaking rate of productivity when compared to the one to two minutes per essay typically allotted to human scorers.
Students responded positively when the New Jersey Institute of Technology introduced the e-rater. An assistant professor there, Andrew Klobucar, observes that whereas students see drafting and revising multiple times as "corrective, even punitive" when assigned by those evil professors, they do not take the same negative view when doing it for an e-rater.
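For a rough sense of scale, here is a back-of-the-envelope comparison in Python, using only the figures quoted above (16,000 essays in 20 seconds for the e-rater versus one to two minutes per essay for a human scorer); the calculation is my own, not ETS's:

# Back-of-the-envelope comparison of the grading rates quoted above.
ERATER_ESSAYS = 16000
ERATER_SECONDS = 20

erater_rate = ERATER_ESSAYS / ERATER_SECONDS   # 800 essays per second

human_fast = 1 / 60    # one essay per minute, expressed in essays per second
human_slow = 1 / 120   # one essay every two minutes

print(f"e-rater: {erater_rate:.0f} essays per second")
print(f"human:   {human_slow:.4f} to {human_fast:.4f} essays per second")
print(f"speedup: roughly {erater_rate / human_fast:,.0f}x to {erater_rate / human_slow:,.0f}x")
# Roughly 48,000x to 96,000x faster than a human scorer, on these assumptions.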

I do wish writers would do their own homework when offering an opinion on the current state of educational technology.