Sakzad, A., Paul, D., Sheard, J., Brankovic, L., Skerritt, M. P., Li, N., Minagar, S., Simon, and Billingsley, W., "Diverging assessments: What, Why, and Experiences," in SIGCSE 2024: Proceedings of the 55th ACM Technical Symposium on Computer Science Education V. 1, March 2024.
In this experience paper, we introduce the concept of 'diverging assessments': process-based assessments designed so that each student's assessment becomes unique, even though all students start from a common skeleton. We present experiences with diverging assessments in the contexts of computer networks, operating systems, ethical hacking, and software development. All of the examples presented permit the use of generative-AI-based tools, are authentic, and are designed to create learning opportunities that foster students' meta-cognition. Finally, we reflect upon these experiences in five courses across four universities, showing how diverging assessments enhance students' learning while respecting academic integrity.