This is the blog for the SGC4L project, funded by the JISC Assessment and Feedback programme and led by the Physics Education Research Group at the University of Edinburgh.

As well as this blog, the project wiki contains documents and information on the progress, development and dissemination activities associated with the project.

Friday, August 3, 2012

PeerWise webinar in the JISC Assessment and Feedback series

On 23 July 2012 we gave a webinar, "Enhancing engagement, feedback and performance: the SGC4L project", as part of the JISC Assessment and Feedback series. As people who have used PeerWise know, by far the best way to appreciate its potential is to use it yourself, so the webinar included an interactive session offering participants the opportunity to get first-hand experience of interacting with others via a specially created PeerWise course area. This was prefaced by an initial presentation outlining what PeerWise is and how to access the course space. We concluded the webinar with a presentation and discussion of how the system is being embedded within courses at Edinburgh, along with some of the findings from the SGC4L project. You can watch the webinar at the link below:



Wednesday, March 7, 2012

Qualitative categorization

Our two final year undergraduate project students are making great progress in categorising samples of the PeerWise questions created by students in our first year Physics class.

They've developed and refined a coding scheme that captures the following information from each student-authored question:


Clarity of Question:
0 – Unclear (including spelling & grammar errors that make the question unclear)
1 – Clear

Feasible Distractors:
0 – None
1 – At least 2 but not all
2 – All Feasible

Explanation:
0 – Missing
1 – Inadequate or wrong
2 – Minimal/unclear
3 – Good/Detailed
4 – Excellent (describes the physics thoroughly, remarks on the plausibility of the answer, uses appropriate diagrams, perhaps explains why one would have obtained the distractors)

Quality of Author Comments:
0 – None
1 – Irrelevant
2 – Relevant

Correct:
0 – Obviously Incorrect
1 – Most Likely Correct

Recognised as Incorrect: 
0 – N/A or not recognised
1 – Recognised as incorrect by students (or disagreement with author)

Diagram: 
0 – None
1 – Contextual picture but not relevant
2 – Relevant diagram or picture

Plagiarism:
0 - Potentially Plagiarised
1 – Not obviously plagiarised

Context of Question: 
0 – None (formulas, recalling info)
1 – Irrelevant or extraneous context (entertaining, imaginary)
2 – Physics (frictionless, idealized situation)
3 – Relevant real-world context (applicable to everyday situations, e.g. cars on racetracks)

Revised Taxonomy:
1 – Remember, Recognise or Recall, OR just plugging in numbers
2 – Understand, Interpret or Predict (no calculation needed; understanding the 3rd law, for example)
3 – Apply, Implement or Calculate (one-step calculation)
4 – Analyse, Differentiate or Organise (multi-step calculation, higher analysis)
5 – Evaluate, Assess or Rank (evaluating various options and assessing their validity)
6 – Create, Combine or Produce (asked to combine various areas of physics; need to get a structure right to solve the whole problem)

This last category maps the question onto levels in the cognitive domain of Bloom's (revised) taxonomy. After doing some tests to establish an acceptable level of inter-rater reliability, we've let the two students loose on their own sets of questions.
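
For those curious about the mechanics of the reliability checks, a statistic such as Cohen's kappa is the natural choice. Here is a minimal sketch of how such a check might be run; the category codes below are invented, and the use of scikit-learn is an assumption for illustration rather than a description of our actual workflow:

    # Illustrative sketch: checking agreement between two raters who have
    # coded the same sample of questions on the 6-level taxonomy dimension.
    # The codes below are invented; scikit-learn is an assumed dependency.
    from sklearn.metrics import cohen_kappa_score

    rater_a = [3, 4, 3, 2, 5, 3, 4, 1, 3, 4]
    rater_b = [3, 4, 2, 2, 5, 3, 4, 2, 3, 5]

    # Unweighted kappa treats every disagreement equally.
    print("kappa:", cohen_kappa_score(rater_a, rater_b))

    # Quadratic weighting penalises large disagreements more heavily,
    # which suits an ordinal scale such as taxonomy levels.
    print("weighted kappa:",
          cohen_kappa_score(rater_a, rater_b, weights="quadratic"))

A weighted kappa is arguably the better fit for an ordinal scale like this one, since a disagreement of three levels should count for more than a disagreement of one.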

The analysis is still in progress, but early indications are that, in contrast to a recently published study (A participatory learning approach to biochemistry using student-authored and evaluated multiple-choice questions, Bottomley and Denny, DOI: 10.1002/bmb.20526), relatively few questions inhabit the lower reaches of this scale: most sit in categories 3 and 4, with non-negligible numbers in the highest categories. I'll post more results when we have them.

Here's a nice example of a question classified at level 5 in the taxonomy. 



Wednesday, January 25, 2012


Getting started with biology students

The second-year biology (genetics) students are now one week into the course and, after only three lectures, there are already 22 questions posted on the course PeerWise site. Moreover, all but the two most recent questions have attracted multiple comments. Almost all of the questions involve digestion and re-synthesis of lecture material. Perhaps this rapid and encouraging uptake is due, in part, to the better scaffolding provided when the task was introduced, compared with last year. A log of the process and the information used to set this task for this class (GGA), including the introductory PowerPoint slides, are available to view on request.

Thursday, January 19, 2012

Thinking qualitatively

We're beginning to think about how to assess the quality of the questions that students have been submitting in their courses, and we have recruited a couple of final-year Honours project students in Physics to help us out.

A starting point is to generate some ideas for the sort of classification scheme we might use. We need to bear in mind that there are several HUNDRED questions for each course, so the classification has to be coarse enough to be done reasonably quickly, but fine enough to capture the essence of the questions across different dimensions. Here are two such dimensions that we might want to consider:

a. A classification based on the cognitive challenge of the questions. We might want to think about mapping onto the various levels of Bloom's Taxonomy: knowledge and recall at the bottom of the hierarchy, and so on.

b. Some sort of measure of 'physics sophistication'. Is the question a straightforward application of a single physics principle (e.g. conservation of energy)? And so on.
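
Whatever scheme we settle on, each question's codes will need to be captured quickly in a form that can be analysed later. Purely as an illustrative sketch (the dimension names and file name below are placeholders, not a settled design), a plain CSV record per question might be enough:

    # Illustrative sketch: capturing each question's codes as a CSV row so
    # that several hundred questions can be coded quickly and analysed later.
    # The dimension names and file name are placeholders, not a settled design.
    import csv

    FIELDS = ["question_id", "cognitive_level", "physics_sophistication"]

    rows = [
        {"question_id": "q001", "cognitive_level": 3, "physics_sophistication": 1},
        {"question_id": "q002", "cognitive_level": 5, "physics_sophistication": 2},
    ]

    with open("classifications.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(rows)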

There's a recent paper by Paul Denny on this topic that should prove a useful starting point for us:

http://onlinelibrary.wiley.com/doi/10.1002/bmb.20526/abstract

Watch this space.

Thursday, December 1, 2011

PeerWise assessment schemes

The third and final PeerWise assessment task in Physics 1A is now underway. Each of these tasks has been summatively assessed, using algorithms to translate the PeerWise scoreboard score into an assignment mark. This is a trickier process than you might think: four files on the main project wiki describe a couple of approaches to doing this and provide Excel templates implementing each algorithm.
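
The wiki files describe the actual algorithms, but as a purely illustrative sketch of the kind of mapping involved (the cap and the square-root compression below are invented parameters, not our scheme), translating a scoreboard score into a mark might look like this:

    # Illustrative sketch only: one way to map a PeerWise scoreboard score
    # onto an assignment mark. The cap and the square-root compression are
    # invented parameters, not the schemes described on the wiki.
    import math

    def score_to_mark(score, max_score, max_mark=5.0):
        """Map a raw scoreboard score onto a mark out of max_mark.

        The square-root transform compresses the long tail of very high
        scores, so extra leaderboard points bring diminishing returns.
        """
        clipped = max(0, min(score, max_score))
        return round(max_mark * math.sqrt(clipped / max_score), 1)

    # Example: against a cap of 2000, a score of 500 earns half marks.
    print(score_to_mark(500, max_score=2000))   # 2.5

One design question any such mapping has to answer is how steeply marks should rise with score: a concave transform like the one sketched rewards participation early on without letting leaderboard chasing dominate the final mark.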

Tuesday, November 22, 2011

Third and final PeerWise exercise in Physics 1A

This week sees the third and final PeerWise exercise in our Physics 1A course. We've collected together some of the material that we used to introduce and scaffold the activities and to provide feedback to students after the assessment activities have finished.


The resources are all available as a self-contained web-extract from our online course notes system.

The timeline is as follows:

  • Week 5: first PeerWise exercise introduced in workshops (the first 3 nodes in the web-extract)
  • Week 6: students work on creating their first questions: PW1 assessment live.
  • Week 7: feedback to students on PW1 outcomes; introduction of the second PeerWise task: improving the quality of distractor answers.
  • Week 8: students work on PW2 ("PeerWiser") 
  • Week 10: introduction to PW3 ("PeerWisest"). Creation of questions synthesising more than one topic from the course. 
  • Week 11: students work on PW3

Monday, November 21, 2011

Well, someone liked it ....

Paul Richardson, from the JISC Regional Support Centre Wales, was one of the participants who came to our workshop last week as part of the JISC online conference activity week. He wrote this blog post about his thoughts on the workshop experience. I am glad we managed to offer something a bit more than just 'sit and listen to us while we tell you what we've done'...

Actually, Judy and I came up with several ideas for how we could have improved the session... maybe next time...