This may or may not become a series depending on how much bile I have left over after the term ends. If you’re lucky, it won’t be a series.
It being 2007 and everything, I figured there must be an easier way to manage multiple-choice exams than to put all the questions in a word processor document, go through and make sure there’s enough variation in the answers, and then copy and paste into a final exam. There must be a database out there that would allow you to compile questions by category, edit and format them, randomize the answer order, and maybe even randomly select from each category for different forms of the test.
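(To be clear about how modest this wish is, here’s roughly the whole feature as a Python sketch. The question bank, data, and function names are all invented for illustration; this is the kind of thing I had in mind, not anything WebCT or Respondus actually provides.)

```python
import random

# Invented question bank for illustration: category -> list of
# (question, [answers], index of the correct answer).
BANK = {
    "theory": [
        ("Who coined the term 'cultural capital'?",
         ["Bourdieu", "Foucault", "Goffman", "Habermas"], 0),
        # ... more questions per category ...
    ],
    "methods": [
        ("A simple random sample means...",
         ["every case has an equal chance of selection",
          "whoever was easiest to reach",
          "cases referred by earlier respondents",
          "the entire population"], 0),
    ],
}

def make_form(per_category, rng):
    """Draw n questions from each category and shuffle each one's answers."""
    form = []
    for category, n in per_category.items():
        for question, answers, correct in rng.sample(BANK[category], n):
            order = list(range(len(answers)))
            rng.shuffle(order)
            shuffled = [answers[i] for i in order]
            # The correct answer's letter after shuffling:
            key = "ABCDE"[order.index(correct)]
            form.append((question, shuffled, key))
    return form

# Four forms of the same exam, each reproducible from its seed.
forms = [make_form({"theory": 1, "methods": 1}, random.Random(seed))
         for seed in range(4)]
```

That, plus some formatting, is the entire job. Keep that in mind as you read what follows.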
I called McGill’s support people and was informed that such capability existed inside WebCT, the “courseware” we use. However, to create a paper test I’d need the companion product, Respondus, which only runs on PCs. Luckily, there was an educational tech designer willing to work with me, and so, I was assured, at the end I’d have a nice, powerful database of questions. That may in fact be the case, but I can tell you that for the “convenience” of software to “save labor” in constructing exams, my TAs, the kind staff person, and I spent many more hours than if we’d just written up the questions and copied and pasted.
1. The interface for entering questions in WebCT is needlessly complex and surprisingly slow. It does not play well with text from word processors, which means you either sit online to write your questions or deal with formatting headaches. And simple options cannot be set as defaults (or, if they can, there is no obvious way to do it), meaning you must stay vigilant to make sure that answers are randomized, that answers are labeled with letters instead of numbers, etc.
2. The interface for transferring questions from WebCT to Respondus is also painfully slow. Once transferred, the questions need to be reorganized AGAIN before they can be arranged into an exam. New errors also showed up in the questions along the way. All this took so long that the kindly person with whom I was working realized there was no way a professor and department chair had time to sit around messing with this thing, and so she spent her own time doing it for me. I imagine it added up to about a day of her time, give or take.
3. For the initial output of my exams, most questions on all 4 forms had an answer of “A.” Although you can randomize the order of answers online, Respondus initially wouldn’t do this for paper exams. Because, on paper exams, OF COURSE you’d want all the answers to be “A.” Someone at McGill had to write a script just so the exam-creation software would produce an exam worth giving; the sketch after this list shows how little a fix like that involves.
4. After all that, the final result is poorly formatted and hard to read, so I can either go in and reformat it myself, or give the students a test that’s difficult to read because answers aren’t properly indented from questions.
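For what it’s worth, the sanity check that would have caught the all-“A” output in item 3 is about four lines. Here’s a hypothetical version, reusing the made-up form structure from the sketch above (again, my invention, not anyone’s actual script):

```python
from collections import Counter

def key_distribution(form):
    """Tally how often each letter is the correct answer on a form."""
    return Counter(key for _question, _answers, key in form)

for i, form in enumerate(forms):
    print(f"Form {i}: {key_distribution(form)}")

# Something like Counter({'B': 8, 'D': 7, 'A': 6, 'C': 4}) is healthy;
# Counter({'A': 25}) means the shuffle never happened.
```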
So, in the end, it took a bunch of extra hours (and a delay of a week) to “save” me the “effort” of using a word processor to design the multiple-choice section of my final exam. And my students received an inferior product as a result, because I didn’t have the time to manually reformat four exams at the last minute.
----
I would like to say that this surprised me, and I guess it did, but I should have known better. Every interaction I’ve had with WebCT has been counterintuitive, to use a word from interface design, and the application is an incredible time suck, even just for entering grades online. The course websites are ugly to look at, and their organizational jargon only makes sense if you teach in exactly the way they’d like you to. Students have trouble finding things for the course, and it takes waayyy too long to do simple course management tasks. Did I mention that it commits the design sin of using popups? Lots of popups. If it weren’t such a hassle for the students, I would seriously consider going offsite.
Ahhh. I feel better now. I feel like there’s more to say, but I had to get this off my chest since I finally got the exams in today, a week late (you have to submit them to a central office for copying and invigilation), and had a last-minute scramble when I discovered even more messed-up formatting.
I had to laugh when I read your blog and really had to write an answer to that 🙂 I am from the University of Zurich, and we use an open-source LMS called OLAT (Online Learning And Training, see http://www.olat.org). In OLAT you can:
– Create a database of questions and then randomly display a subset of it to each student (all tests are based on the QTI standard)
– We have professors here who have created databases of 1,000 or more questions in OLAT; for their tests, each student is presented a randomized subset of e.g. 20 questions (time-limited)
– The interface for creating tests is web-based and pretty fast (see also below about QANT)
– You can export Excel files with all the results for a nice overview
– OLAT looks nice and clean, not as ugly as WebCT 🙂 (see http://demo.olat.org for a demo server)
For more elaborate tests we use a tool called QANT (http://www.focusedpublishing.com/products/products.html). This tool lets you create a pool of questions and build tests out of it, create PDF versions, etc. Via the QTI standard you can then import these tests into OLAT.
Hope this helped…
Wow… Consider this… The version you are using has made improvements over the one we are using.
For whatever it’s worth, here at IU we use a system called “Oncourse.” It’s part of an open-source courseware project called “Sakai.” I hardly use it, except for communicating with my students en masse via email. Still, I gather its functionality is fairly good overall, and on top of that, IU isn’t indentured to a private (and monopolistically inclined) courseware company like WebCT. So there you go.
EZ, I feel your pain.
As to an open-source alternative, I would love nothing more. Alas, it is not my decision to make, and it sounds like the upper admin at McGill is wedded to WebCT for now. The question is what I will choose to do next year.