Peer review of team marks using a web-based tool: an evaluation
Journal contribution posted on 13.07.2010 by Peter Willmot, Adam Crawford
Of all the problems associated with student learning in a team situation, the difficulty of fairly assessing the individual team members is the most acute. Academics who feel comfortable setting examinations and single-person assignments are deterred from setting team assessments because they fear that idle students may benefit from the efforts of their team-mates or that weaker team members might dilute the efforts of the more diligent. This paper discusses how accurately academics can recreate the rewards for good or bad performance in industry through undergraduate team projects. The arguments for allocating equal team marks are examined, but the authors conclude this is not the correct approach. A web-based system for applying peer moderation to team marks is described, and the data accumulated from it allow peer marks to be compared with anonymous self-assessments. Validation is completed by comparing the peer assessment outcomes with control data supplied by independent mentors who were attached to each student team. The results generate a high level of confidence in the approach. Peer review results for teams were further used to estimate the degree of harmony amongst team members: a high standard deviation in peer marks might indicate conflict, whereas a low standard deviation could be a sign of a harmonious team that one might expect to out-perform the individual potential of its members. Previous academic track record was used as the benchmark for potential success but was found to be a poor predictor of actual achievement in team project work.
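The abstract does not specify the moderation algorithm, but a common scheme in the peer assessment literature scales the shared team mark by an individual weighting factor (each student's average peer score divided by the team average), and the spread of those averages can serve as the harmony indicator mentioned above. The sketch below illustrates that idea; the function name and data layout are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of one common peer-moderation scheme (an assumption, not
# necessarily the algorithm used by the web-based tool described here).
from statistics import mean, stdev

def moderated_marks(team_mark, peer_scores):
    """Scale a shared team mark by each member's relative peer rating.

    peer_scores: dict mapping student id -> list of scores received
                 from team-mates.
    Returns (dict of moderated marks, spread of average peer scores).
    """
    averages = {s: mean(v) for s, v in peer_scores.items()}
    team_mean = mean(averages.values())
    # Individual weighting factor = own average / team average, so the
    # mean of the moderated marks stays close to the original team mark.
    marks = {s: round(team_mark * a / team_mean, 1)
             for s, a in averages.items()}
    # A low spread suggests a harmonious team; a high spread may
    # indicate conflict, as the paper's analysis suggests.
    harmony = stdev(averages.values())
    return marks, harmony

scores = {"A": [4, 5, 4], "B": [3, 3, 4], "C": [5, 5, 5]}
marks, spread = moderated_marks(70, scores)
```

In this example the diligent member "C" receives a mark above the team mark of 70 and the weaker contributor "B" a mark below it, while the team average is preserved.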
- Mechanical, Electrical and Manufacturing Engineering