Improving Automated Programming Assessments: User Experience Evaluation Using FaSt-generator

Automatic Programming Assessment (APA) is a method for automatically marking and grading students’ programming solutions. To realise APA as a tangible deliverable, a number of automated tools, known as Automated Programming Assessment Systems (APAS), have been developed...


Bibliographic Details
Main Authors: Rohaida, Romli; Shahida, Sulaiman; Kamal Z., Zamli
Format: Article
Language:English
Published: Elsevier 2015
Subjects:
Online Access:http://umpir.ump.edu.my/id/eprint/11813/
http://umpir.ump.edu.my/id/eprint/11813/1/Improving%20Automated%20Programming%20Assessments-%20User%20Experience%20Evaluation%20Using%20FaSt-generator.pdf
id ump-11813
recordtype eprints
spelling ump-118132018-01-16T02:14:09Z http://umpir.ump.edu.my/id/eprint/11813/ Improving Automated Programming Assessments: User Experience Evaluation Using FaSt-generator Rohaida, Romli Shahida, Sulaiman Kamal Z., Zamli QA76 Computer software Automatic Programming Assessment (APA) is a method for automatically marking and grading students’ programming solutions. To realise APA as a tangible deliverable, a number of automated tools, known as Automated Programming Assessment Systems (APAS), have been developed and tested for decades. Reducing lecturers’ workload, providing timely feedback to students, and improving the accuracy of grading results are the common motivations for APAS. To carry out dynamic testing in APA, it is necessary to prepare an appropriate and adequate set of test data to judge the correctness of students’ programming solutions in terms of functional and/or structural testing. Manual preparation of quality test data is a hard, time-consuming, and often infeasible task in the practice of both software testing and APA. Thus, automated test data generation is highly desirable to relieve humans of such repetitive tasks. This paper describes the design, implementation, and user experience evaluation of a tool developed to support APA as a test data generator, called FaSt-generator. The tool furnishes a test set that includes adequate test data to execute both functional and structural testing in APA. Results from the user experience evaluation of FaSt-generator reveal that all subjects had relatively positive opinions and greatly favoured the criteria of User Perception and End-User Computing Satisfaction (EUCS).
Elsevier 2015 Article PeerReviewed application/pdf en cc_by_nc_nd http://umpir.ump.edu.my/id/eprint/11813/1/Improving%20Automated%20Programming%20Assessments-%20User%20Experience%20Evaluation%20Using%20FaSt-generator.pdf Rohaida, Romli and Shahida, Sulaiman and Kamal Z., Zamli (2015) Improving Automated Programming Assessments: User Experience Evaluation Using FaSt-generator. Procedia Computer Science, 72. pp. 186-193. ISSN 1877-0509 http://dx.doi.org/10.1016/j.procs.2015.12.120 DOI: 10.1016/j.procs.2015.12.120
repository_type Digital Repository
institution_category Local University
institution Universiti Malaysia Pahang
building UMP Institutional Repository
collection Online Access
language English
topic QA76 Computer software
spellingShingle QA76 Computer software
Rohaida, Romli
Shahida, Sulaiman
Kamal Z., Zamli
Improving Automated Programming Assessments: User Experience Evaluation Using FaSt-generator
description Automatic Programming Assessment (APA) is a method for automatically marking and grading students’ programming solutions. To realise APA as a tangible deliverable, a number of automated tools, known as Automated Programming Assessment Systems (APAS), have been developed and tested for decades. Reducing lecturers’ workload, providing timely feedback to students, and improving the accuracy of grading results are the common motivations for APAS. To carry out dynamic testing in APA, it is necessary to prepare an appropriate and adequate set of test data to judge the correctness of students’ programming solutions in terms of functional and/or structural testing. Manual preparation of quality test data is a hard, time-consuming, and often infeasible task in the practice of both software testing and APA. Thus, automated test data generation is highly desirable to relieve humans of such repetitive tasks. This paper describes the design, implementation, and user experience evaluation of a tool developed to support APA as a test data generator, called FaSt-generator. The tool furnishes a test set that includes adequate test data to execute both functional and structural testing in APA. Results from the user experience evaluation of FaSt-generator reveal that all subjects had relatively positive opinions and greatly favoured the criteria of User Perception and End-User Computing Satisfaction (EUCS).
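The record does not detail how FaSt-generator produces its test sets, but the abstract's idea of generating test data for functional (specification-based) testing and then marking a student solution against a reference can be sketched in a few lines. The functions `generate_test_data` and `grade`, and the integer-range specification, are illustrative assumptions, not the tool's actual design:

```python
import random

def generate_test_data(lo, hi, n_random=5, seed=0):
    """Generate test inputs for an integer parameter in [lo, hi]:
    boundary values (a common functional-testing heuristic) plus
    random interior values for broader coverage."""
    rng = random.Random(seed)
    boundary = [lo, lo + 1, hi - 1, hi]          # boundary-value analysis
    interior = [rng.randint(lo, hi) for _ in range(n_random)]
    return boundary + interior

def grade(student_fn, reference_fn, test_data):
    """Mark a student's solution by comparing its output with a
    reference solution on every generated test input."""
    passed = sum(student_fn(x) == reference_fn(x) for x in test_data)
    return passed / len(test_data)               # fraction of tests passed

# Example: assess a student's absolute-value implementation.
reference = abs
student = lambda x: x if x > 0 else -x           # a correct solution
data = generate_test_data(-100, 100)
print(grade(student, reference, data))           # 1.0 for a correct solution
```

A real APAS would additionally instrument the student's code to measure structural coverage (e.g. branch coverage) of the generated test set, which this sketch omits.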
format Article
author Rohaida, Romli
Shahida, Sulaiman
Kamal Z., Zamli
author_facet Rohaida, Romli
Shahida, Sulaiman
Kamal Z., Zamli
author_sort Rohaida, Romli
title Improving Automated Programming Assessments: User Experience Evaluation Using FaSt-generator
title_short Improving Automated Programming Assessments: User Experience Evaluation Using FaSt-generator
title_full Improving Automated Programming Assessments: User Experience Evaluation Using FaSt-generator
title_fullStr Improving Automated Programming Assessments: User Experience Evaluation Using FaSt-generator
title_full_unstemmed Improving Automated Programming Assessments: User Experience Evaluation Using FaSt-generator
title_sort improving automated programming assessments: user experience evaluation using fast-generator
publisher Elsevier
publishDate 2015
url http://umpir.ump.edu.my/id/eprint/11813/
http://umpir.ump.edu.my/id/eprint/11813/1/Improving%20Automated%20Programming%20Assessments-%20User%20Experience%20Evaluation%20Using%20FaSt-generator.pdf
first_indexed 2023-09-18T22:12:47Z
last_indexed 2023-09-18T22:12:47Z
_version_ 1777415139361816576