Assessing Large Project Courses: Model, Activities, and Lessons Learned
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, Faculty of Science & Engineering. (Real-time Systems Laboratory)
KTH Royal Institute of Technology, Stockholm, Sweden; University of California, Berkeley, USA.
Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, Faculty of Science & Engineering.
2015 (English). In: ACM Transactions on Computing Education, ISSN 1946-6226, E-ISSN 1946-6226, Vol. 15, no. 4, p. 20:1-20:30. Article in journal (Refereed). Published.
Abstract [en]

In a modern computing curriculum, large project courses are essential to give students hands-on experience of working in a realistic software engineering project. Assessing such projects is, however, extremely challenging. There are various aspects and tradeoffs of assessment that can affect course quality. Individual assessments can give fair grading of individuals, but may lose focus on the project as a group activity. Extensive teacher involvement is necessary for objective assessment, but may affect the way students work. Continuous feedback to students can enhance learning, but may be hard to combine with fair assessment. Most previous work focuses on some specific assessment aspect, whereas in this paper we present an assessment model that consists of a collection of assessment activities, each covering different aspects. We have applied, developed, and improved these activities over a seven-year period. To evaluate the usefulness of the model, we performed questionnaire-based surveys over a two-year period. Furthermore, we designed and executed an experiment that studies to what extent students can perform fair peer assessment and to what degree the assessments of students and teachers agree. We analyze the results, discuss our findings, and summarize lessons learned.

Place, publisher, year, edition, pages
ACM Special Interest Group on Computer Science Education, 2015. Vol. 15, no. 4, p. 20:1-20:30.
National Category
Computer and Information Science
Identifiers
URN: urn:nbn:se:liu:diva-123544
DOI: 10.1145/2732156
ISI: 000367991400005
OAI: oai:DiVA.org:liu-123544
DiVA: diva2:886168
Available from: 2015-12-21. Created: 2015-12-21. Last updated: 2017-12-01. Bibliographically approved.

Open Access in DiVA

fulltext (518 kB), 72 downloads
File information
File name: FULLTEXT02.pdf
File size: 518 kB
Checksum (SHA-512): cc8af9ae5c5868935e65cc5b9456d16d35e6b8ea03ffdd0f15f909b2fdc9b31930f1ea14956b26d5147f6a876918ecbb5c3059f5b15541f54854e0863b2b72d4
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text

Authority records

Vasilevskaya, Maria; Broman, David; Sandahl, Kristian

Total: 72 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 623 hits